Human motion is fundamentally driven by continuous physical interaction with the environment. Whether we are walking, running, or simply standing, the forces exchanged between our feet and the ground provide crucial insights for understanding and reconstructing human movement. Recent advances in wearable insole devices offer a compelling solution for capturing these forces in diverse, real-world scenarios. Sensor insoles impose no constraints on the user's motion (unlike mocap suits) and are unaffected by line-of-sight limitations (in contrast to optical systems). These qualities make sensor insoles an ideal choice for robust, unconstrained motion capture, particularly in outdoor environments. Surprisingly, leveraging these devices with recent motion reconstruction methods remains largely unexplored. To fill this gap, we present Step2Motion, the first approach to reconstruct human locomotion from multi-modal insole sensors. Our method utilizes pressure and inertial data (accelerations and angular rates) captured by the insoles to reconstruct human motion. We evaluate the effectiveness of our approach across a range of experiments, demonstrating its versatility for diverse locomotion styles, from simple gaits such as walking or jogging to moving sideways, walking on tiptoes, slightly crouching, and dancing.