Brain-inspired Spiking Neural Networks (SNNs) have garnered considerable research interest owing to their superior performance and energy efficiency in processing temporal signals. Recently, a novel multi-compartment spiking neuron model, the Two-Compartment LIF (TC-LIF) model, was proposed and has exhibited a remarkable capacity for sequential modeling. However, training the TC-LIF model is challenging due to the large memory consumption and the vanishing-gradient problem associated with the Backpropagation Through Time (BPTT) algorithm. Online learning methodologies emerge as a promising solution to these challenges. Yet, to date, the application of online learning methods in SNNs has been largely confined to simplified Leaky Integrate-and-Fire (LIF) neuron models. In this paper, we present a novel online learning method specifically tailored to networks of TC-LIF neurons. In addition, we propose a refined TC-LIF neuron model, termed Adaptive TC-LIF, which is carefully designed to enhance temporal information integration in online learning scenarios. Extensive experiments on various sequential benchmarks demonstrate that our approach preserves the superior sequential modeling capability of the TC-LIF neuron while gaining the training efficiency and hardware friendliness of online learning. It thereby opens up a multitude of opportunities for applying neuromorphic solutions to temporal signal processing.