We present PiMForce, a novel framework that improves hand pressure estimation by augmenting forearm surface electromyography (sEMG) signals with 3D hand posture information. Our approach combines the detailed spatial information of 3D hand poses with the dynamic muscle activity captured by sEMG, enabling accurate and robust whole-hand pressure measurement under diverse hand-object interactions. We also developed a multimodal data collection system that integrates a pressure glove, an sEMG armband, and a markerless finger-tracking module. Using this system, we created a comprehensive dataset from 21 participants, capturing synchronized hand posture, sEMG signals, and exerted hand pressure across various hand postures and hand-object interaction scenarios. Our framework enables precise hand pressure estimation in complex, natural interaction scenarios, and the integration of 3D hand posture information with sEMG signals substantially mitigates the limitations of purely sEMG-based or vision-based methods. Video demos, data, and code are available online.
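To make the fusion idea concrete, here is a minimal, hypothetical sketch of combining 3D hand pose with sEMG features for pressure regression. The feature choices (per-channel RMS and mean absolute value for sEMG, flattened joint coordinates for pose), the 21-joint hand model, the 8-channel armband, and the taxel count are all illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def extract_semg_features(semg_window):
    """Per-channel RMS and mean absolute value — two common sEMG
    time-domain features (assumed here, not the paper's exact choice)."""
    rms = np.sqrt(np.mean(semg_window ** 2, axis=0))
    mav = np.mean(np.abs(semg_window), axis=0)
    return np.concatenate([rms, mav])

def fuse_pose_and_semg(joints_3d, semg_window):
    """Concatenate flattened 3D joint positions with sEMG features,
    illustrating how posture can augment muscle-activity signals."""
    return np.concatenate([joints_3d.ravel(),
                           extract_semg_features(semg_window)])

# Toy example: 21 joints x 3 coords, 8-channel sEMG, 200-sample window.
rng = np.random.default_rng(0)
joints = rng.normal(size=(21, 3))       # hypothetical 3D hand pose
semg = rng.normal(size=(200, 8))        # hypothetical sEMG window

features = fuse_pose_and_semg(joints, semg)   # 63 pose + 16 sEMG = 79 dims

# A linear head mapping fused features to per-taxel pressure stands in
# for whatever learned regressor the actual framework uses.
n_taxels = 19                            # hypothetical glove taxel count
W = rng.normal(size=(n_taxels, features.size)) * 0.01
pressure = W @ features                  # one pressure value per taxel
```

The point of the sketch is only the data flow: posture supplies spatial context that disambiguates sEMG readings which would otherwise map to many possible pressure distributions.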