Recent advances in data-driven reinforcement learning and motion tracking have substantially improved humanoid locomotion, yet critical practical challenges remain. In particular, while low-level motion tracking and trajectory-following controllers are mature, whole-body reference-guided methods are difficult to adapt to higher-level command interfaces and diverse task contexts: they require large, high-quality datasets, are brittle across speed and pose regimes, and are sensitive to robot-specific calibration. To address these limitations, we propose the Parameterized Motion Generator (PMG), a real-time motion generator grounded in an analysis of human motion structure that synthesizes reference trajectories from only a compact set of parameterized motion data together with high-dimensional control commands. Combined with an imitation-learning pipeline and an optimization-based sim-to-real motor parameter identification module, we validate the complete approach on our humanoid prototype ZERITH Z1 and show that, within a single integrated system, PMG produces natural, human-like locomotion, responds precisely to high-dimensional control inputs (including VR-based teleoperation), and enables efficient, verifiable sim-to-real transfer. Together, these results establish a practical, experimentally validated pathway toward natural and deployable humanoid control. Website: https://pmg-icra26.github.io/