We introduce GEOTACT, a robotic manipulation method capable of retrieving objects buried in granular media. This task is challenging because it requires interacting with granular media, and doing so based exclusively on tactile feedback, since a buried object can be completely hidden from vision. Tactile feedback is itself challenging in this context, due to ubiquitous contact with the surrounding media and the inherent noise in the tactile readings. To address these challenges, we use a learning method trained end-to-end with simulated sensor noise. We show that our problem formulation leads to the natural emergence of learned pushing behaviors, which the manipulator uses to reduce uncertainty and funnel the object to a stable grasp despite spurious and noisy tactile readings. We also introduce a training curriculum that enables learning these behaviors in simulation, followed by zero-shot transfer to real hardware. To the best of our knowledge, GEOTACT is the first method to reliably retrieve a number of different objects from a granular environment, doing so on real hardware and with integrated tactile sensing. Videos and additional information can be found at https://jxu.ai/geotact.