In this project, we design a system based on a Tactile-Enabled Roller Grasper to achieve in-hand manipulation without access to an object shape prior or vision information. To localize and reconstruct objects from noisy, narrow-field tactile sensor data, we introduce tactile SLAM for object pose estimation and shape reconstruction. To strategically trade off exploration of the global object shape against efficient task completion, we introduce Bayesian optimization for task-driven exploration.
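The pose-estimation half of tactile SLAM can be illustrated with a minimal sketch: given sparse contact points observed in the world frame and the current shape model in the object frame, a rigid pose is recovered by least-squares alignment. The function below uses the Kabsch algorithm in 2D and assumes known contact correspondences for simplicity; it is an illustrative stand-in, not the system's actual estimator.

```python
import numpy as np

def estimate_pose(model_pts, contact_pts):
    """Estimate a rigid 2D pose (R, t) aligning model points to observed
    tactile contact points via the Kabsch algorithm (least-squares fit).
    Assumes one-to-one correspondences; illustrative sketch only."""
    mu_m = model_pts.mean(axis=0)
    mu_c = contact_pts.mean(axis=0)
    # Cross-covariance of centered point sets
    H = (model_pts - mu_m).T @ (contact_pts - mu_c)
    U, _, Vt = np.linalg.svd(H)
    # Reflection correction keeps R a proper rotation (det = +1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = mu_c - R @ mu_m
    return R, t
```

In the full system the contact measurements accumulate over time, so the same alignment idea is applied jointly with shape updates rather than against a fixed model.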
Contribution: In this project, I mainly contributed the tactile SLAM and BO-based exploration components in simulation.
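The BO-based exploration can be sketched as follows: a Gaussian-process surrogate scores candidate touch locations, and an upper-confidence-bound acquisition trades off expected task utility (posterior mean) against shape uncertainty (posterior standard deviation). This is a minimal 1D sketch with an assumed RBF kernel and hand-picked hyperparameters, not the project's actual exploration policy.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel between two 1D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def ucb_next_touch(x_seen, y_seen, candidates, beta=2.0, noise=1e-6):
    """Pick the next contact location by GP-UCB: posterior mean plus
    beta * posterior std over candidate touch points."""
    K = rbf(x_seen, x_seen) + noise * np.eye(len(x_seen))
    Ks = rbf(x_seen, candidates)
    K_inv = np.linalg.inv(K)
    mu = Ks.T @ K_inv @ y_seen
    # Diagonal of the posterior covariance (prior variance is 1 for RBF)
    var = 1.0 - np.sum(Ks * (K_inv @ Ks), axis=0)
    acq = mu + beta * np.sqrt(np.maximum(var, 0.0))
    return candidates[np.argmax(acq)]
```

With two touches already observed at 0.0 and 1.0, the acquisition favors the unexplored midpoint, which captures the exploration side of the tradeoff; the task-driven side enters through the reward values `y_seen`.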
This work studies how a robot can reorient objects in-hand with limited prior knowledge about object shape. Specifically, we investigate in-hand manipulation in highly occluded settings where the robot cannot rely on vision and must instead leverage tactile sensing to feel the object's shape and localize it in space. Our task consists of reorienting a diverse set of objects to different target orientations. To ensure that the robot does not have access to the object shape before completing the task, we avoid using an image or point cloud to specify the target orientation of the object to be manipulated. Instead, we formulate our task as a peg insertion task: the robot is given an image of a hole and must determine how to reorient an object to fit into that hole. The shape of the hole provides minimal information about the 3D shape of the object, ensuring that our assumption that the robot is working with an unseen object holds.