24 Apr 2024 | Qinxi Yu*,1, Masoud Moghani*,1, Karthik Dharmarajan2, Vincent Schorp2,3, William Chung-Ho Panitch2, Jingzhou Liu1,5, Kush Hari2, Huang Huang2, Mayank Mittal3,5, Ken Goldberg2, Animesh Garg1,4,5
ORBIT-Surgical is an open-source, physics-based surgical robot simulation framework designed to accelerate progress in robot learning for surgical applications. Built on NVIDIA Isaac Sim, the framework offers high-fidelity rendering, GPU-accelerated physics, and rich object models supporting both rigid and deformable materials. It provides 14 benchmark tasks for the da Vinci Research Kit (dVRK) and the Smart Tissue Autonomous Robot (STAR), covering common surgical subtasks. ORBIT-Surgical supports a range of robot learning workflows, including reinforcement learning (RL), imitation learning (IL), teleoperation, and synthetic data generation, and its parallelized GPU simulation and rich sensory elements enable efficient data collection and fast learning.
Experimental results demonstrate the effectiveness of ORBIT-Surgical in training RL policies for surgical tasks, the benefits of expert demonstrations for long-horizon tasks, and the successful transfer of learned policies to a physical dVRK robot. The framework also facilitates synthetic data generation for improved perception performance and showcases the potential for augmented surgical dexterity.
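The "parallelized GPU simulation" workflow mentioned above refers to stepping many environments in lockstep as one batched operation. The following is a minimal, purely illustrative sketch of that rollout pattern; it uses a toy NumPy environment of my own invention (`ToyVecEnv` is not part of the ORBIT-Surgical API), since real usage would go through Isaac Sim and the framework's own task definitions.

```python
import numpy as np

class ToyVecEnv:
    """Hypothetical stand-in: a batch of N point-reaching tasks stepped in lockstep,
    mimicking how a GPU simulator advances thousands of environments per call."""

    def __init__(self, num_envs: int, seed: int = 0):
        self.num_envs = num_envs
        self.rng = np.random.default_rng(seed)
        self.state = None

    def reset(self) -> np.ndarray:
        # One (x, y) state per parallel environment.
        self.state = self.rng.uniform(-1.0, 1.0, size=(self.num_envs, 2))
        return self.state

    def step(self, actions: np.ndarray):
        # All environments advance in a single batched array operation.
        self.state = self.state + 0.1 * actions
        rewards = -np.linalg.norm(self.state, axis=1)  # goal: reach the origin
        dones = rewards > -0.05
        return self.state, rewards, dones

if __name__ == "__main__":
    env = ToyVecEnv(num_envs=1024)
    obs = env.reset()
    for _ in range(10):
        actions = -obs  # simple proportional controller toward the origin
        obs, rewards, dones = env.step(actions)
    print(obs.shape)  # one observation per parallel environment
```

An RL library would replace the hand-coded controller with a policy network and consume the batched rewards; the key point is that data collection scales with the batch dimension rather than with wall-clock episodes.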