This paper enables real-world humanoid robots to maintain stability while performing expressive motions like humans do. We propose ExBody2, a generalized whole-body tracking framework that can take any reference motion input and control the humanoid to mimic it. The model is trained in simulation with reinforcement learning and then transferred to the real world. It decouples keypoint tracking from velocity control, and effectively leverages a privileged teacher policy to distill precise imitation skills into the target student policy, enabling high-fidelity replication of dynamic movements such as running, crouching, dancing, and other challenging motions. We present a comprehensive qualitative and quantitative analysis of the crucial design factors. We conduct experiments on two humanoid platforms and demonstrate the superiority of our approach over state-of-the-art methods, providing practical guidelines for pushing the limits of whole-body control for humanoid robots.
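The teacher-student distillation described above can be illustrated with a minimal sketch. This is not the paper's implementation: the network sizes, observation dimensions, and the plain MSE regression step are all illustrative assumptions. The key idea shown is that a privileged teacher (which sees full state in simulation) supervises a student that only sees deployable proprioceptive observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- chosen for illustration, not from the paper.
PRIV_DIM, PROP_DIM, ACT_DIM = 16, 8, 4

def init_mlp(in_dim, hidden, out_dim):
    """Weights for a two-layer MLP with tanh hidden activation."""
    return [rng.normal(0, 0.1, (in_dim, hidden)), np.zeros(hidden),
            rng.normal(0, 0.1, (hidden, out_dim)), np.zeros(out_dim)]

def mlp_forward(x, W1, b1, W2, b2):
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

# The "privileged" teacher sees the full simulation state; here we stand in
# a randomly initialized network for an already RL-trained teacher policy.
teacher = init_mlp(PRIV_DIM, 32, ACT_DIM)
# The student only receives proprioceptive observations (a subset of state).
student = init_mlp(PROP_DIM, 32, ACT_DIM)

def distill_step(priv_obs, prop_obs, lr=1e-2):
    """One regression step: the student mimics the teacher's actions (MSE)."""
    target = mlp_forward(priv_obs, *teacher)
    W1, b1, W2, b2 = student
    h = np.tanh(prop_obs @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - target                       # dL/dpred for 0.5 * MSE
    # Manual backprop through the two layers.
    gW2 = h.T @ err / len(err); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = prop_obs.T @ dh / len(err); gb1 = dh.mean(0)
    student[0] -= lr * gW1; student[1] -= lr * gb1
    student[2] -= lr * gW2; student[3] -= lr * gb2
    return float(np.mean(err**2))

priv = rng.normal(size=(64, PRIV_DIM))
prop = priv[:, :PROP_DIM]                     # student observes partial state
losses = [distill_step(priv, prop) for _ in range(200)]
print(losses[0] > losses[-1])                 # distillation loss decreases
```

In practice this regression is performed online (DAgger-style), rolling out the student in simulation while querying the teacher for target actions, which is what allows precise tracking skills to transfer to the deployable observation space.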