Spiking Neural Networks (SNNs), as the third generation of neural networks, have gained prominence for their biological plausibility and computational efficiency, especially in processing diverse datasets. The integration of attention mechanisms, inspired by advances in neural network architectures, has led to the development of Spiking Transformers, which have shown promise in enhancing SNNs' capabilities on both static and neuromorphic datasets. Despite this progress, a discernible gap remains: the Spiking Self Attention (SSA) mechanism does not fully exploit the temporal processing potential of SNNs. To address this, we introduce the Temporal Interaction Module (TIM), a novel convolution-based enhancement designed to augment temporal data processing within SNN architectures. TIM integrates seamlessly and efficiently into existing SNN frameworks, requiring minimal additional parameters while significantly boosting their ability to handle temporal information. Through rigorous experimentation, TIM has demonstrated its effectiveness in exploiting temporal information, achieving state-of-the-art performance across various neuromorphic datasets. The code is available at https://github.com/BrainCog-X/Brain-Cog/tree/main/examples/TIM.
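To make the idea concrete, the sketch below illustrates one way a convolution-based temporal interaction can enrich per-timestep attention queries: at each step, the current query is mixed with a convolved summary of the accumulated history. This is a minimal, hypothetical simplification in NumPy; the function name `tim_step`, the smoothing kernel, and the mixing factor `alpha` are illustrative assumptions, not the paper's exact formulation (see the linked repository for the actual implementation).

```python
import numpy as np

def tim_step(q_t, h_prev, kernel, alpha=0.5):
    """One timestep of a TIM-style temporal interaction (illustrative sketch):
    blend the current query q_t with a 1-D convolution of the history state."""
    # 1-D convolution over the feature dimension with "same" padding,
    # acting as a learned-filter stand-in that transforms past information
    conv_hist = np.convolve(h_prev, kernel, mode="same")
    # alpha controls how much past (convolved) information is retained
    return alpha * conv_hist + (1.0 - alpha) * q_t

# Toy usage: T timesteps of d-dimensional queries from a spiking attention block
rng = np.random.default_rng(0)
T, d = 4, 8
queries = rng.standard_normal((T, d))
kernel = np.array([0.25, 0.5, 0.25])  # illustrative smoothing kernel

h = np.zeros(d)  # history state starts empty
outputs = []
for t in range(T):
    h = tim_step(queries[t], h, kernel)
    outputs.append(h)
outputs = np.stack(outputs)  # shape (T, d): temporally enriched queries
```

Because the history state is threaded through every timestep, later queries carry information from earlier ones at negligible parameter cost (here, only the kernel and `alpha`), which mirrors the lightweight-integration claim in the abstract.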