Surface electromyography (sEMG) based gesture recognition offers a natural and intuitive interaction modality for wearable devices. Despite significant advances in sEMG-based gesture-recognition models, existing methods often suffer from high computational latency and increased energy consumption. Additionally, the inherent instability of sEMG signals, combined with their sensitivity to distribution shifts in real-world settings, compromises model robustness. To tackle these challenges, we propose SpGesture, a novel framework based on Spiking Neural Networks (SNNs), which offers several unique merits over existing methods: (1) Robustness: by using the membrane potential as a memory list, we introduce Source-Free Domain Adaptation into SNNs for the first time, enabling SpGesture to mitigate the accuracy degradation caused by distribution shifts. (2) High accuracy: a novel Spiking Jaccard Attention enhances the SNN's ability to represent sEMG features, yielding a notable improvement in system accuracy. To validate SpGesture's performance, we collected a new sEMG gesture dataset spanning different forearm postures, on which SpGesture achieved the highest accuracy among all baselines ($89.26\%$). Moreover, actual deployment on a CPU showed a system latency below 100 ms, well within real-time requirements. These results highlight SpGesture's potential to broaden the applicability of sEMG in real-world scenarios. The code is available at https://anonymous.4open.science/r/SpGesture.
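To illustrate the idea behind a Jaccard-style attention score, the sketch below computes pairwise Jaccard similarity between binary spike trains as a replacement for dot-product similarity. This is a minimal assumption-laden sketch, not the paper's implementation: the function name, tensor shapes, and the use of raw Jaccard similarity as attention scores are all illustrative choices.

```python
import numpy as np

def jaccard_attention_scores(spikes):
    """Hypothetical sketch: pairwise Jaccard similarity between spike trains.

    spikes: (T, N) binary spike matrix (T time steps, N neurons).
    Returns an (N, N) score matrix where entry (i, j) is
    |A_i AND A_j| / |A_i OR A_j| for spike trains A_i, A_j.
    """
    s = spikes.astype(bool)                                 # (T, N)
    inter = (s[:, :, None] & s[:, None, :]).sum(axis=0)     # |A ∩ B|, (N, N)
    union = (s[:, :, None] | s[:, None, :]).sum(axis=0)     # |A ∪ B|, (N, N)
    # Guard against empty unions (two all-zero spike trains).
    return np.where(union > 0, inter / np.maximum(union, 1), 0.0)

# Toy example: 3 time steps, 3 neurons.
scores = jaccard_attention_scores(np.array([[1, 0, 1],
                                            [1, 1, 0],
                                            [0, 1, 1]]))
```

Because spike trains are binary, Jaccard similarity needs only AND/OR/count operations, which is one plausible reason such a score suits SNN hardware better than floating-point dot products; the diagonal of `scores` is 1 for any neuron that fires at least once.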