G-HOP: Generative Hand-Object Prior for Interaction Reconstruction and Grasp Synthesis

18 Apr 2024 | Yufei Ye¹, Abhinav Gupta¹, Kris Kitani¹˒², Shubham Tulsiani¹
G-HOP is a generative model of hand-object interactions (HOI) in 3D that can synthesize plausible interactions across a wide variety of objects. It uses a denoising diffusion process to capture the joint distribution over a 3D object and a human hand, conditioned on the object category. By representing the hand as a skeletal distance field, the model can effectively reason about spatial interactions and generate realistic HOI configurations. The learned generative prior can then guide inference tasks such as reconstructing interaction clips from monocular video and synthesizing plausible human grasps given an object mesh. Empirical evaluations show that G-HOP outperforms current task-specific baselines on both video-based reconstruction and human grasp synthesis, demonstrating its effectiveness at generating diverse and plausible interactions.
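The skeletal distance field mentioned in the abstract can be illustrated with a minimal sketch: for each query point in a 3D grid, compute the distance to the nearest hand bone, modeled as a line segment between a joint and its parent. This is an assumption-laden simplification for illustration (the paper's actual field construction and hand parameterization may differ); the function name, the `parents` kinematic-tree encoding, and the unsigned single-channel output are all hypothetical here.

```python
import numpy as np

def skeletal_distance_field(joints, parents, grid_points):
    """Unsigned distance from each query point to the nearest hand bone.

    joints:      (J, 3) array of 3D joint positions
    parents:     length-J list; parents[j] is the parent joint index, -1 for the root
    grid_points: (N, 3) array of query points (e.g. a voxel grid, flattened)

    Note: a simplified illustration, not the paper's exact formulation.
    """
    dists = np.full(len(grid_points), np.inf)
    for j, p in enumerate(parents):
        if p < 0:                       # the root joint has no parent bone
            continue
        a, b = joints[p], joints[j]     # bone endpoints
        ab = b - a
        denom = float(np.dot(ab, ab)) + 1e-9
        # project each query point onto the bone segment, clamped to [0, 1]
        t = np.clip((grid_points - a) @ ab / denom, 0.0, 1.0)
        closest = a + t[:, None] * ab
        d = np.linalg.norm(grid_points - closest, axis=1)
        dists = np.minimum(dists, d)    # keep the nearest bone per point
    return dists
```

Evaluated on a voxel grid around the hand, such a field gives the diffusion model a dense spatial encoding of hand geometry that it can reason about jointly with the object's shape representation.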