MotionLCM: Real-time Controllable Motion Generation via Latent Consistency Model

30 Apr 2024 | Wenxun Dai1, Ling-Hao Chen1, Jingbo Wang2*, Jinpeng Liu1, Bo Dai2*, Yansong Tang1
This paper introduces MotionLCM, a real-time controllable motion generation model that achieves high-quality text-to-motion synthesis and precise motion control in approximately 30 milliseconds. The model is built on the motion latent diffusion model (MLD) and employs one-step or few-step inference to improve runtime efficiency. To enable effective controllability, a motion ControlNet is incorporated into the latent space of MotionLCM, allowing explicit control signals (e.g., a pelvis trajectory) to influence the generation process directly. Experimental results demonstrate that MotionLCM can generate high-quality human motions from text and control signals in real time, maintaining a balance between generation quality and efficiency. The key contributions of the paper include the introduction of MotionLCM, the development of a motion ControlNet for controllable motion generation, and extensive experimental validation showing the effectiveness of the proposed method.
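The few-step sampling pattern described above can be sketched as follows. This is a minimal, hypothetical illustration only: `consistency_fn` and `control_branch` are toy stand-ins (not the authors' actual networks or API) that show how a consistency model maps a noisy latent directly to a clean-latent estimate in one call, with a ControlNet-style branch injecting a control signal (e.g., a pelvis trajectory) as an additive residual in latent space.

```python
import numpy as np

def consistency_fn(z_t, t, text_emb, control_feat):
    # Toy stand-in for the trained consistency model f(z_t, t, c):
    # maps a noisy latent z_t at timestep t directly to a clean-latent
    # estimate, conditioned on text and control features (assumption,
    # not the paper's actual network).
    return 0.9 * text_emb + 0.1 * z_t / (1.0 + t) + control_feat

def control_branch(z_t, pelvis_traj):
    # Toy "motion ControlNet": pools the control signal and injects it
    # as an additive residual in the latent space (assumption).
    return 0.05 * pelvis_traj.mean(axis=0, keepdims=True)

def sample(text_emb, pelvis_traj, steps=(999, 499, 0), dim=256, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((1, dim))              # start from pure noise
    for i, t in enumerate(steps):
        ctrl = control_branch(z, pelvis_traj)
        z0 = consistency_fn(z, t, text_emb, ctrl)  # one network call per step
        if i < len(steps) - 1:
            # Re-noise the clean estimate for the next (lower) timestep;
            # with steps=(t,) this loop degenerates to one-step inference.
            z = z0 + 0.1 * rng.standard_normal((1, dim))
        else:
            z = z0
    return z  # clean motion latent; a VAE decoder would map it to motion

latent = sample(text_emb=np.ones((1, 256)), pelvis_traj=np.zeros((10, 256)))
print(latent.shape)  # (1, 256)
```

Because each step is a single forward pass (rather than the tens or hundreds of denoising steps a standard diffusion sampler needs), this is what makes roughly 30 ms real-time generation plausible.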