Imitation learning from human hand motion data presents a promising avenue for imbuing robots with human-like dexterity in real-world manipulation tasks. Despite this potential, substantial challenges persist, particularly in the portability of existing hand motion capture (mocap) systems and the complexity of translating mocap data into effective robot policies. To tackle these issues, we introduce DexCap, a portable hand motion capture system, alongside DexIL, a novel imitation algorithm for training dexterous robot skills directly from human hand mocap data. DexCap offers precise, occlusion-resistant tracking of wrist and finger motions based on SLAM and electromagnetic field (EMF) sensing, together with 3D observations of the environment. Utilizing this rich dataset, DexIL employs inverse kinematics and point cloud-based imitation learning to replicate human actions with robot hands. Beyond direct learning from human motion, DexCap also offers an optional human-in-the-loop correction mechanism during policy rollouts to refine and further improve task performance. Through extensive evaluation across six challenging dexterous manipulation tasks, our approach not only demonstrates superior performance but also showcases the system's capability to learn effectively from in-the-wild mocap data, paving the way for future data collection methods in the pursuit of human-level robot dexterity. More details can be found at https://dex-cap.github.io