We introduce a novel system for human-to-robot trajectory transfer that enables robots to manipulate objects by learning from human demonstration videos. The system consists of four modules. The first module is a data collection module that collects human demonstration videos from the point of view of a robot using an AR headset. The second module is a video understanding module that detects objects and extracts 3D human-hand trajectories from the demonstration videos. The third module transfers a human-hand trajectory into a reference trajectory of a robot end-effector in 3D space. The last module utilizes a trajectory optimization algorithm to solve for a trajectory in the robot configuration space that follows the end-effector trajectory transferred from the human demonstration. Together, these modules enable a robot to watch a human demonstration video once and then repeat the same mobile manipulation task in different environments, even when objects are placed differently than in the demonstrations. Experiments on different manipulation tasks are conducted on a mobile manipulator to verify the effectiveness of our system.
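The last module solves for a configuration-space trajectory whose end-effector poses track the transferred reference trajectory. The abstract does not specify the solver, so the following is only a minimal sketch of that idea on a toy planar 2-link arm, using damped-least-squares inverse kinematics warm-started along the reference path; the arm model, function names, and parameters are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fk(q, l1=1.0, l2=1.0):
    """Forward kinematics of a planar 2-link arm: joint angles -> end-effector (x, y)."""
    x = l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1])
    y = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q, l1=1.0, l2=1.0):
    """Analytic Jacobian of fk with respect to the two joint angles."""
    s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
    c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def follow_trajectory(ref_xy, q0, iters=200, damping=1e-3):
    """Solve a joint-space trajectory that tracks a reference end-effector path.

    Each waypoint is solved with damped-least-squares IK, warm-started from the
    previous solution so consecutive configurations stay close (a stand-in for
    the smoothness term a real trajectory optimizer would enforce).
    """
    q = np.asarray(q0, dtype=float)
    traj = []
    for target in ref_xy:
        for _ in range(iters):
            err = target - fk(q)
            if np.linalg.norm(err) < 1e-8:
                break
            J = jacobian(q)
            # Damped least squares: dq = J^T (J J^T + lambda I)^{-1} err
            dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), err)
            q = q + dq
        traj.append(q.copy())
    return np.array(traj)

# Usage: track an arc of reachable end-effector positions (radius 1.5).
angles = np.linspace(0.2, 1.0, 20)
ref = np.stack([1.5 * np.cos(angles), 1.5 * np.sin(angles)], axis=1)
traj = follow_trajectory(ref, q0=[0.0, 1.0])
```

Warm-starting each waypoint from the previous solution is what makes the resulting joint trajectory continuous; solving each waypoint independently from a fixed seed could jump between elbow-up and elbow-down solutions.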