Since their introduction, spiking neural networks (SNNs) have gained recognition for their high performance, low power consumption, and enhanced biological interpretability. Alongside these advantages, however, the binary nature of spikes leads to considerable information loss in SNNs, ultimately causing performance degradation. We argue that the limited expressiveness of current binary spikes, and the substantial information loss it entails, is the fundamental issue behind these challenges. To alleviate this, our work introduces a multi-bit information transmission mechanism for SNNs. This mechanism expands the output of spiking neurons from a single bit to multiple bits, enhancing the expressiveness of the spikes and reducing information loss during the forward pass, while still preserving the low-energy-consumption advantage of SNNs. For SNNs, this represents a new paradigm of information transmission. Moreover, to make fuller use of the limited spikes, we extract effective signals from the previous layer to re-stimulate the neurons, thereby encouraging full spike emission across the various bit levels. We conducted extensive experiments with the proposed method under both direct training and ANN-SNN conversion, and the results show consistent performance improvements.
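The core mechanism described above can be illustrated with a minimal sketch: instead of the usual binary firing rule, the membrane potential is quantized into a multi-bit spike level per timestep. This is an illustrative assumption of how such a quantizer might look, not the paper's exact formulation; the function names, the soft-reset rule, and the parameter values are all hypothetical.

```python
import numpy as np

def multibit_spike(v, threshold=1.0, bits=2):
    """Quantize membrane potential into a multi-bit spike.

    Instead of the binary rule s = (v >= threshold), the output is an
    integer spike level in {0, ..., 2**bits - 1}, so each neuron can
    transmit `bits` bits per timestep. Illustrative sketch only; the
    paper's exact quantizer may differ.
    """
    levels = 2 ** bits - 1
    s = np.floor(v / threshold)            # number of thresholds crossed
    return np.clip(s, 0, levels).astype(int)

def lif_step(v, x, threshold=1.0, tau=2.0):
    """One leaky integrate-and-fire update with a multi-bit output:
    leak, integrate the input, emit a multi-bit spike, then soft-reset
    by subtracting the emitted spike level times the threshold."""
    v = v / tau + x
    s = multibit_spike(v, threshold)
    v = v - s * threshold                  # soft reset
    return v, s
```

With `bits=2`, a potential of 3.7 emits spike level 3 (the maximum for two bits), whereas a binary neuron could only emit 1, which is the expressiveness gap the abstract refers to.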