Spiking Neural Networks (SNNs) seek to mimic the spiking behavior of biological neurons and are expected to play a key role in the advancement of neural computing and artificial intelligence. The efficiency of SNNs is often determined by the neural coding scheme. Existing coding schemes either incur long latency and high energy consumption or require intricate neuron models and training techniques. To address these issues, we propose a novel Stepwise Weighted Spike (SWS) coding scheme to enhance the encoding of information in spikes. This approach compresses spikes by weighting the significance of each spike at every step of neural computation, achieving high performance and low energy consumption. To support SWS-based computing, we further propose a Ternary Self-Amplifying (TSA) neuron model with a silent period, which minimizes the residual error introduced by stepwise weighting. Our experimental results show that the SWS coding scheme outperforms existing neural coding schemes in very deep SNNs while significantly reducing operations and latency.
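To make the idea of stepwise-weighted, ternary spikes concrete, the following is a minimal illustrative sketch (not the paper's actual algorithm): a real-valued activation is greedily encoded into spikes from {-1, 0, +1}, where the spike at step t carries a geometrically decaying weight. The step count, base weight, and function names are all assumptions chosen for illustration; the paper's TSA neuron and training procedure are not reproduced here.

```python
def sws_encode(x, steps=8, base=0.5):
    """Illustrative stepwise-weighted ternary spike encoding (assumed scheme).

    At step t the spike carries weight base**t; the neuron emits +1, -1,
    or 0 (a silent step) to cancel as much of the residual as possible.
    Returns the spike train and the final residual error.
    """
    spikes = []
    residual = float(x)
    for t in range(steps):
        w = base ** t
        if residual > w / 2:
            s = 1          # positive spike
        elif residual < -w / 2:
            s = -1         # negative spike
        else:
            s = 0          # silent: residual too small for this weight
        spikes.append(s)
        residual -= s * w
    return spikes, residual

def sws_decode(spikes, base=0.5):
    """Reconstruct the value as the weighted sum of ternary spikes."""
    return sum(s * base ** t for t, s in enumerate(spikes))
```

With more steps the residual shrinks geometrically, which mirrors the abstract's point that stepwise weighting compresses information into few spikes at the cost of a small residual error that the neuron model must keep in check.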