Post-stroke grasping deficits highlight the critical need to develop and deploy advanced compensatory strategies. This paper introduces a novel system to aid chronic stroke survivors: a soft extra robotic finger equipped with vision-based tactile sensing. Using this tactile sensing, the system autonomously adjusts its grip force in response to detected slippage. This synergy not only ensures mechanical stability but also enriches tactile feedback, mimicking the dynamics of human-object interaction. At the core of our approach is a transformer-based framework trained on a comprehensive tactile dataset covering objects with a wide range of morphological properties, including variations in shape, size, weight, texture, and hardness. We further validated the system's robustness in real-world trials, where it successfully manipulated a variety of everyday objects. These promising results highlight the potential of the approach to improve the quality of life of stroke survivors.
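The closed-loop behavior described above — raise grip force when slippage is detected, otherwise hold — can be sketched minimally as follows. This is an illustrative sketch only: the class and parameter names (`GripController`, `step`, `max_force`) are hypothetical and not taken from the paper, and the per-frame slip predictions stand in for the output of a tactile slip classifier.

```python
from dataclasses import dataclass


@dataclass
class GripController:
    """Toy slip-triggered grip loop: increase force on slip, else hold.

    All values are illustrative assumptions, not the paper's controller.
    """
    force: float = 1.0       # current grip force (N), assumed initial value
    step: float = 0.5        # increment applied per detected slip event (N)
    max_force: float = 10.0  # safety ceiling on commanded force (N)

    def update(self, slip_detected: bool) -> float:
        # Tighten the grip only when the tactile classifier reports slip.
        if slip_detected:
            self.force = min(self.force + self.step, self.max_force)
        return self.force


# Example: a stream of per-frame slip predictions (e.g. from a
# transformer classifier over tactile image sequences).
predictions = [False, True, True, False, True]
ctrl = GripController()
forces = [ctrl.update(p) for p in predictions]
print(forces)  # [1.0, 1.5, 2.0, 2.0, 2.5]
```

In a real system the binary slip flag would come from the vision-based tactile model and the force command would go to the finger's actuator; the ceiling prevents unbounded force escalation on persistent slip.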