Tactile exploration plays a crucial role in understanding object structures for fundamental robotics tasks such as grasping and manipulation. However, efficiently exploring such objects with tactile sensors is challenging, primarily due to large-scale unknown environments and the limited sensing coverage of these sensors. To this end, we present AcTExplore, an active tactile exploration method driven by reinforcement learning that automatically explores object surfaces for reconstruction at scale within a limited number of steps. Through sufficient exploration, our algorithm incrementally collects tactile data and reconstructs the 3D shapes of objects, which can serve as a representation for higher-level downstream tasks. Our method achieves an average of 95.97% IoU coverage on unseen YCB objects despite being trained only on primitive shapes. Project Webpage: https://prg.cs.umd.edu/AcTExplore
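The abstract reports reconstruction quality as IoU coverage. As a minimal illustrative sketch (not the paper's implementation), IoU between a reconstructed and a ground-truth shape can be computed on boolean occupancy voxel grids; the function name `voxel_iou` and the toy grids below are assumptions for illustration only.

```python
import numpy as np

def voxel_iou(pred: np.ndarray, gt: np.ndarray) -> float:
    """IoU between two same-shape boolean occupancy grids:
    |intersection| / |union| of occupied voxels."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    # Two empty grids trivially match.
    return float(inter) / float(union) if union > 0 else 1.0

# Toy example: two overlapping 4x4x4 occupancy grids.
a = np.zeros((4, 4, 4), dtype=bool)
a[:2] = True          # 32 occupied voxels (layers 0-1)
b = np.zeros((4, 4, 4), dtype=bool)
b[1:3] = True         # 32 occupied voxels (layers 1-2)
print(voxel_iou(a, b))  # 16 shared / 48 total ≈ 0.333
```

A higher IoU indicates that the incrementally built tactile reconstruction covers more of the true surface volume.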