Teleoperation for robot imitation learning is bottlenecked by hardware availability. Can high-quality robot data be collected without a physical robot? We present a system for augmenting Apple Vision Pro with real-time virtual robot feedback. By giving users an intuitive understanding of how their actions translate to robot motions, we enable the collection of natural, barehanded human data that is compatible with the limitations of physical robot hardware. We conducted a user study with 15 participants, each demonstrating 3 different tasks under 3 different feedback conditions, and directly replayed the collected trajectories on physical robot hardware. Results show that live robot feedback dramatically improves the quality of the collected data, opening a new avenue for scalable human data collection without access to robot hardware. Videos and more are available at https://nataliya.dev/armada.