💁Introduction

This work studies how a robot can reorient objects in-hand with limited prior knowledge of object shape. Specifically, we investigate in-hand manipulation in highly occluded settings where the robot cannot rely on vision and must instead use tactile sensing to feel the object's shape and localize it in space. Our task consists of reorienting a diverse set of objects to different target orientations. To ensure that the robot has no access to the object's shape before completing the task, we avoid specifying the target orientation with an image or point cloud of the object. Instead, we formulate the task as peg insertion: the robot is given an image of a hole and must determine how to reorient an object to fit into that hole. The shape of the hole provides only minimal information about the object's 3D shape, which preserves our assumption that the robot is working with an unseen object.
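To make this task formulation concrete, below is a minimal, hypothetical sketch of what the observation and goal interface might look like. All names, array shapes, and the `TactileReorientEnv` class are illustrative assumptions, not the actual implementation; the point is that the goal is a 2D hole silhouette and the observations are tactile and proprioceptive, never a camera view of the object.

```python
import numpy as np


class TactileReorientEnv:
    """Hypothetical sketch of the task interface described above.

    The goal is given only as a binary image of the hole's cross
    section, so the policy never sees the object's full 3D shape.
    Observations come from tactile sensing and proprioception.
    """

    def __init__(self, hole_image: np.ndarray):
        # Binary (H, W) silhouette of the hole: the only shape cue given.
        self.hole_image = hole_image

    def reset(self) -> dict:
        # Note: no camera view of the object itself appears here.
        return {
            "tactile": np.zeros((16, 3)),  # e.g. per-taxel contact forces
            "joint_pos": np.zeros(9),      # proprioceptive joint angles
            "goal": self.hole_image,       # 2D hole silhouette
        }

    def step(self, action: np.ndarray):
        # `action` would command the grasper joints/rollers; stubbed here.
        obs = self.reset()
        reward, done = 0.0, False
        return obs, reward, done, {}


# A policy for this task maps (tactile, proprioception, hole image)
# to grasper commands; a trained policy would replace this stub.
def policy(obs: dict) -> np.ndarray:
    return np.zeros(9)  # placeholder action
```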

🪧Demonstrations

🕵️ An example of human “blind” in-hand manipulation

🤖Hardware System

Roller Grasper rolling the object in hand
Freely moving joints

👮Control Policy to Reorient Object

🚇Whole Pipeline Demo

🗣️Full Presentation

💡Digital Poster
