Mirroring the complex structures and diverse functions of natural organisms is a long-standing challenge in robotics. Modern fabrication techniques have dramatically expanded feasible hardware, yet deploying these systems requires control software to translate desired motions into actuator commands. While conventional robots can easily be modeled as rigid links connected via joints, it remains an open challenge to model and control bio-inspired robots that are often multi-material or soft, lack sensing capabilities, and may change their material properties with use. Here, we introduce Neural Jacobian Fields, an architecture that autonomously learns to model and control robots from vision alone. Our approach makes no assumptions about the robot's materials, actuation, or sensing, requires only a single camera for control, and learns to control the robot without expert intervention by observing the execution of random commands. We demonstrate our method on a diverse set of robot manipulators, varying in actuation, materials, fabrication, and cost. Our approach achieves accurate closed-loop control and recovers the causal dynamic structure of each robot. By enabling robot control with a generic camera as the only sensor, we anticipate our work will dramatically broaden the design space of robotic systems and serve as a starting point for lowering the barrier to robotic automation.
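The closed-loop control idea underlying the approach — using a Jacobian relating actuator commands to observed point motion, and inverting it to track a visual target — can be sketched in miniature. The sketch below is an illustrative assumption, not the paper's implementation: it substitutes a fixed random matrix for the learned Neural Jacobian Field and a linear toy "robot" for a real one, and uses a standard resolved-rate update via the pseudo-inverse.

```python
import numpy as np

def closed_loop_step(jacobian, current_points, target_points, gain=0.5):
    """One resolved-rate control step: map the desired motion of tracked
    points to an actuator-command update via the Jacobian's
    least-squares pseudo-inverse."""
    error = (target_points - current_points).ravel()  # desired point motion
    return gain * np.linalg.pinv(jacobian) @ error

# Toy stand-in for a learned Jacobian: 4 tracked 2D points, 3 actuators.
rng = np.random.default_rng(0)
J = rng.normal(size=(8, 3))        # (points * dims) x actuators
u = np.zeros(3)                    # actuator command
target = rng.normal(size=(4, 2))   # desired point positions in the image

for _ in range(50):
    points = (J @ u).reshape(4, 2)  # linear "robot" purely for illustration
    u = u + closed_loop_step(J, points, target)
```

With a full-column-rank Jacobian, this iteration converges geometrically to the least-squares command `pinv(J) @ target.ravel()`; in the paper's setting the Jacobian would instead be predicted from the camera image at each step, so no analytic robot model is ever required.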