When carrying out robotic manipulation tasks, grasped objects occasionally fall as a result of rotation caused by slippage. This can be prevented by obtaining tactile information that provides better knowledge of the physical properties of the grasp. In this paper, we estimate the rotation angle of a grasped object when slippage occurs. We implement a system composed of a neural network that segments the contact region and an algorithm that estimates the rotation angle of that region. The method is applied to DIGIT tactile sensors. Our system has been trained and tested with our publicly available dataset, which is, to the best of our knowledge, the first dataset for tactile segmentation from non-synthetic images to appear in the literature, and with which we have attained Dice and IoU scores of 95% and 90%, respectively, in the worst case. Moreover, we obtained a maximum error of 3 degrees when testing on objects not previously seen by the system over 45 different lifts. This demonstrates that our approach can detect the slippage movement, thus enabling a reaction that prevents the object from falling.
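The abstract does not specify how the rotation angle is computed from the segmented contact region, nor how Dice and IoU are evaluated. As a minimal illustrative sketch (not the authors' implementation), the two metrics can be computed by set overlap on binary masks, and a region's orientation can be estimated from its second-order central image moments, with the rotation angle obtained as the difference in orientation between two frames:

```python
import numpy as np

def dice_iou(pred, gt):
    """Dice and IoU overlap scores for two binary segmentation masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    dice = 2.0 * inter / (pred.sum() + gt.sum())
    iou = inter / union
    return dice, iou

def mask_orientation_deg(mask):
    """Principal-axis orientation of a binary mask, in degrees,
    from second-order central image moments (a standard technique;
    assumed here, not taken from the paper)."""
    ys, xs = np.nonzero(mask)
    xc, yc = xs.mean(), ys.mean()
    mu20 = ((xs - xc) ** 2).mean()
    mu02 = ((ys - yc) ** 2).mean()
    mu11 = ((xs - xc) * (ys - yc)).mean()
    return 0.5 * np.degrees(np.arctan2(2.0 * mu11, mu20 - mu02))

def rotation_between(mask_t0, mask_t1):
    """Estimated in-plane rotation of the contact region between frames."""
    return mask_orientation_deg(mask_t1) - mask_orientation_deg(mask_t0)
```

For example, a horizontal elongated region yields an orientation near 0 degrees and a vertical one near 90 degrees, so `rotation_between` on those two masks reports roughly a 90-degree slip. A real system would additionally handle empty masks and the 180-degree ambiguity of the principal axis.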