Functional grasping is essential for humans to perform specific tasks, such as grasping scissors by the finger holes to cut materials, or by the blades to hand them over safely. Endowing dexterous robot hands with functional grasping capabilities is crucial for deploying them on diverse real-world tasks. Recent research on dexterous grasping, however, often focuses on power grasps while overlooking task- and object-specific functional grasping poses. In this paper, we introduce FunGrasp, a system that enables functional dexterous grasping across various robot hands and performs one-shot transfer to unseen objects. Given a single RGBD image of a functional human grasp, our system estimates the hand pose and transfers it to different robot hands via a human-to-robot (H2R) grasp retargeting module. Guided by the retargeted grasping poses, a policy is trained through reinforcement learning in simulation for dynamic grasping control. To achieve robust sim-to-real transfer, we employ several techniques, including privileged learning, system identification, domain randomization, and gravity compensation. In our experiments, we demonstrate that our system enables diverse functional grasping of unseen objects from single RGBD images and can be successfully deployed across various dexterous robot hands. The significance of each component is validated through comprehensive ablation studies. Project page: https://hly-123.github.io/FunGrasp/.