TRAM: Global Trajectory and Motion of 3D Humans from in-the-wild Videos

2 Sep 2024 | Yufu Wang, Ziyun Wang, Lingjie Liu, and Kostas Daniilidis
TRAM is a two-stage method proposed by Yufu Wang, Ziyun Wang, Lingjie Liu, and Kostas Daniilidis to reconstruct the global trajectory and motion of 3D humans from in-the-wild videos. The first stage robustifies SLAM to recover the camera motion in the presence of dynamic humans and uses scene background to derive the motion scale. The second stage introduces VIMO (Video Transformer for Human Motion), a video transformer model that regresses the kinematic body motion of a human. By composing these two motions, TRAM achieves accurate recovery of 3D human motion in the world space, significantly reducing global motion errors compared to prior work. The method leverages a large pre-trained model, HMR2.0, and fine-tunes it on video data to achieve state-of-the-art reconstruction accuracy. TRAM addresses the challenges of dynamic humans and complex scenes, providing a robust and scalable solution for 3D human motion estimation.
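The composition step can be illustrated with a minimal sketch. Assuming the first stage yields per-frame camera-to-world poses (rotations and metric-scaled translations) and the second stage yields the human root pose in each camera frame, the world-frame trajectory is obtained by applying the camera pose to the camera-frame root pose at every frame. The function and variable names below are illustrative, not from the TRAM codebase:

```python
import numpy as np

def compose_world_motion(R_wc, t_wc, R_ch, t_ch):
    """Compose camera poses with camera-frame human root poses (sketch).

    R_wc: (T, 3, 3) camera-to-world rotations from robustified SLAM
    t_wc: (T, 3)    camera-to-world translations, metric-scaled via the background
    R_ch: (T, 3, 3) human root orientations in the camera frame (e.g. from VIMO)
    t_ch: (T, 3)    human root translations in the camera frame
    Returns world-frame root rotations (T, 3, 3) and translations (T, 3).
    """
    # World rotation: rotate the camera-frame orientation into the world frame.
    R_wh = np.einsum('tij,tjk->tik', R_wc, R_ch)
    # World translation: rotate the camera-frame position, then add camera position.
    t_wh = np.einsum('tij,tj->ti', R_wc, t_ch) + t_wc
    return R_wh, t_wh
```

The key point is that the camera trajectory must be in metric scale before composition; otherwise the recovered global human trajectory drifts or shrinks, which is why TRAM derives the motion scale from the scene background.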