AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch


3 Nov 2024 | Max Yang¹, Chenghua Lu¹, Alex Church², Yijiong Lin¹, Chris Ford¹, Haoran Li¹, Efi Psomopoulou¹, David A.W. Barton¹*, Nathan F. Lepora¹*
This paper presents AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch. The system leverages sim-to-real reinforcement learning and rich tactile sensing to train a unified policy for rotating objects about any desired axis in any hand orientation.

The key contributions are:
1. An RL formulation using auxiliary goals for learning a unified policy for in-hand object rotation about any desired axis, for any hand orientation relative to gravity.
2. A dense tactile representation, consisting of contact pose and contact force, for learning in-hand manipulation in simulation.
3. An approach to achieve zero-shot sim-to-real tactile policy transfer, validated on 10 diverse objects in the real world.

The system uses a 16-DoF tactile robot hand attached to a UR5 arm, providing rich tactile feedback for stable in-hand object rotation. The results show that the dense tactile policy is robust across hand directions and rotation axes and maintains high performance when deployed on a rotating hand, highlighting the benefit of capturing detailed contact information when handling objects of varying properties. The paper also discusses the challenges of in-hand manipulation, including the need for high-precision control of secure grasps in the presence of gravity and the importance of rich tactile sensing for dexterous manipulation. Overall, the proposed system outperforms policies that use simpler tactile representations and demonstrates the effectiveness of sim-to-real tactile policy transfer.
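To make the dense tactile representation and the auxiliary-goal reward more concrete, the sketch below shows one plausible way to assemble a per-fingertip (contact pose, contact force) observation and score progress toward a rotation goal. This is a minimal illustration under stated assumptions, not the paper's implementation: the function names, the 4-fingertip layout, the use of a surface normal as a stand-in for contact pose, and the reward weights are all hypothetical.

```python
import numpy as np

NUM_FINGERTIPS = 4  # assumed four-fingered, 16-DoF hand

def quat_conjugate(q):
    # Quaternion in [x, y, z, w] convention.
    return np.array([-q[0], -q[1], -q[2], q[3]])

def quat_mul(a, b):
    x1, y1, z1, w1 = a
    x2, y2, z2, w2 = b
    return np.array([
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
    ])

def dense_tactile_features(contact_normals, contact_forces):
    """Build a flat dense tactile observation, one entry per fingertip.

    Hypothetical layout: each fingertip contributes a unit contact
    normal (a simple proxy for contact pose) plus a scalar contact
    force magnitude, concatenated into a single vector.
    """
    feats = []
    for n, f in zip(contact_normals, contact_forces):
        n = n / (np.linalg.norm(n) + 1e-8)       # normalize pose direction
        feats.append(np.concatenate([n, [np.linalg.norm(f)]]))
    return np.concatenate(feats)                 # shape: (NUM_FINGERTIPS * 4,)

def auxiliary_goal_reward(obj_quat, goal_quat, rot_axis, obj_angvel,
                          k_rot=1.0, k_vel=0.1):
    """Reward sketch for rotating an object about a commanded axis.

    The policy is rewarded for angular velocity along `rot_axis` and
    penalized for angular distance to the next auxiliary goal
    orientation; in an auxiliary-goal scheme the goal is advanced by a
    fixed increment about the axis each time it is reached.
    """
    # Angular velocity projected onto the desired rotation axis.
    vel_reward = k_vel * float(np.dot(obj_angvel, rot_axis))
    # Angle of the relative rotation between object and goal quaternions.
    dq = quat_mul(goal_quat, quat_conjugate(obj_quat))
    angle_err = 2.0 * np.arccos(np.clip(abs(dq[3]), 0.0, 1.0))
    return vel_reward - k_rot * angle_err
```

Because the observation depends only on contact geometry and force rather than vision, a representation of this kind is what allows the same policy to be queried for any rotation axis and any hand orientation relative to gravity.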