We present a method for controlling a simulated humanoid to grasp an object and move it along a target object trajectory. Due to the challenges of controlling a humanoid with dexterous hands, prior methods often use a disembodied hand and consider only vertical lifts or short trajectories. This limited scope hampers their applicability to the object manipulation required for animation and simulation. To close this gap, we learn a controller that can pick up a large number (>1200) of objects and carry them along randomly generated trajectories. Our key insight is to leverage a humanoid motion representation that provides human-like motor skills and significantly speeds up training. Using only simple reward, state, and object representations, our method scales favorably to diverse objects and trajectories. For training, we do not need a dataset of paired full-body motion and object trajectories. At test time, we require only the object mesh and the desired trajectories for grasping and transporting. To demonstrate the capabilities of our method, we show state-of-the-art success rates in following object trajectories and generalizing to unseen objects. Code and models will be released.
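To make the "simple reward" claim concrete, a minimal sketch of one common form of object-trajectory-following reward is shown below. This is an illustrative assumption, not the paper's actual reward: the function name `trajectory_reward`, the sensitivity constant `k`, and the exponentiated-distance form are all hypothetical choices, used here only to show the kind of lightweight reward such a controller might be trained with.

```python
import numpy as np

def trajectory_reward(obj_pos: np.ndarray, ref_pos: np.ndarray, k: float = 5.0) -> float:
    """Hypothetical per-timestep reward for object trajectory following.

    Returns a value in (0, 1] that is 1 when the object's position exactly
    matches the reference trajectory point, decaying exponentially with the
    Euclidean tracking error. The constant k controls how sharply the
    reward falls off with distance (an assumed value, not from the paper).
    """
    err = np.linalg.norm(obj_pos - ref_pos)
    return float(np.exp(-k * err))
```

A reward of this shape depends only on object state, matching the abstract's point that no paired full-body motion data is needed: the controller is free to discover any grasp and gait that keeps the object on the reference trajectory.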