Recent advances in data-driven reinforcement learning and motion tracking have substantially improved humanoid locomotion, yet critical practical challenges remain. In particular, while low-level motion-tracking and trajectory-following controllers are mature, whole-body reference-guided methods are difficult to adapt to higher-level command interfaces and diverse task contexts: they require large, high-quality datasets, are brittle across speed and pose regimes, and are sensitive to robot-specific calibration. To address these limitations, we propose the Parameterized Motion Generator (PMG), a real-time motion generator grounded in an analysis of human motion structure, which synthesizes reference trajectories from only a compact set of parameterized motion data together with high-dimensional control commands. Combined with an imitation-learning pipeline and an optimization-based sim-to-real motor parameter identification module, we validate the complete approach on our humanoid prototype ZERITH Z1 and show that, within a single integrated system, PMG produces natural, human-like locomotion, responds precisely to high-dimensional control inputs (including VR-based teleoperation), and enables efficient, verifiable sim-to-real transfer. Together, these results establish a practical, experimentally validated pathway toward natural and deployable humanoid control.