Humanoid robots are envisioned to adapt demonstrated motions to diverse real-world conditions while accurately preserving motion patterns. Existing motion-prior approaches achieve good adaptability from only a few motions but often sacrifice imitation accuracy, whereas motion-tracking methods achieve accurate imitation yet require many training motions and a test-time target motion to adapt to. To combine their strengths, we introduce AdaMimic, a novel motion-tracking algorithm that enables adaptable humanoid control from a single reference motion. To reduce data dependence while ensuring adaptability, our method first creates an augmented dataset by sparsifying the single reference motion into keyframes and applying light edits under minimal physical assumptions. A policy is then initialized by tracking these sparse keyframes to generate dense intermediate motions, and adapters are subsequently trained to adjust the tracking speed and refine low-level actions accordingly, enabling flexible time warping that further improves imitation accuracy and adaptability. We validate these improvements both in simulation and on a real-world Unitree G1 humanoid robot across multiple tasks and a wide range of adaptation conditions. Videos and code are available at https://taohuang13.github.io/adamimic.github.io/.
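To make the pipeline concrete, the sketch below illustrates the keyframe idea in its simplest form: a dense joint-space motion is sparsified into keyframes, dense intermediate frames are recovered by interpolation, and keyframe timestamps are rescaled to emulate a speed adjustment. This is a minimal illustration under simplifying assumptions (uniform-stride keyframe selection, linear interpolation, a single global speed factor); the function names are hypothetical and the paper's actual policy and adapters are learned, not hand-coded.

```python
import numpy as np

def sparsify(motion, stride=5):
    """Keep every `stride`-th frame plus the final frame as keyframes.
    `motion` is a (T, D) array of joint positions over T timesteps.
    (Illustrative uniform selection, not the paper's exact criterion.)"""
    idx = list(range(0, len(motion), stride))
    if idx[-1] != len(motion) - 1:
        idx.append(len(motion) - 1)  # always keep the last frame
    idx = np.array(idx)
    return idx, motion[idx]

def densify(key_idx, key_poses, num_frames):
    """Recover a dense (num_frames, D) trajectory from keyframes by
    per-dimension linear interpolation (stand-in for the learned policy
    that generates dense intermediate motions)."""
    t = np.arange(num_frames)
    return np.stack(
        [np.interp(t, key_idx, key_poses[:, d]) for d in range(key_poses.shape[1])],
        axis=1,
    )

def time_warp(key_idx, speed=1.5):
    """Rescale keyframe timestamps by a speed factor, a crude analogue of
    the adapter's tracking-speed adjustment (larger speed -> faster playback)."""
    warped = key_idx / speed
    return warped - warped[0]  # re-anchor at t = 0
```

For a motion that varies linearly in time, interpolation through the keyframes reconstructs the original trajectory exactly; for real motions, the reconstruction error is what a tracking policy must close.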