We present a Spiking Neural Network (SNN) model that incorporates learnable synaptic delays through two approaches: per-synapse delay learning via Dilated Convolutions with Learnable Spacings (DCLS), and a dynamic pruning strategy that also serves as a form of delay learning. In the latter approach, the network dynamically selects and prunes connections, optimizing the delays in sparse connectivity settings. We evaluate both approaches on the Raw Heidelberg Digits keyword spotting benchmark using Backpropagation Through Time with surrogate gradients. Our analysis of the spatio-temporal structure of synaptic interactions reveals that, after training, excitation and inhibition group together in space and time. Notably, the dynamic pruning approach, which employs DEEP R for connection removal and RigL for reconnection, not only preserves these spatio-temporal patterns but also outperforms per-synapse delay learning in sparse networks. Our results demonstrate the potential of combining delay learning with dynamic pruning to develop efficient SNN models for temporal data processing. Moreover, the preservation of spatio-temporal dynamics throughout pruning and rewiring highlights the robustness of these features, providing a solid foundation for future neuromorphic computing applications.
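To make the two mechanisms concrete, the following is a minimal NumPy sketch of how per-synapse delays and a sparse connectivity mask interact in a feed-forward layer. All names, shapes, and parameter values here are illustrative assumptions, not the paper's implementation: in the actual model the delays are learned via DCLS and the mask is updated by DEEP R/RigL during training, whereas here both are fixed at random for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

T, n_in, n_out = 100, 8, 4                              # time steps, layer sizes (illustrative)
spikes = (rng.random((T, n_in)) < 0.1).astype(float)    # binary input spike trains

# Hypothetical parameters: synaptic weights, per-synapse delays (in time
# steps; learnable in the paper via DCLS), and a binary connectivity mask
# (updated by DEEP R / RigL in the dynamic pruning approach).
W = rng.normal(0.0, 0.5, (n_in, n_out))
delays = rng.integers(0, 20, (n_in, n_out))
mask = (rng.random((n_in, n_out)) < 0.5).astype(float)  # ~50% sparsity

def delayed_input_current(spikes, W, delays, mask):
    """Sum weighted, per-synapse-delayed spikes into each output neuron."""
    T = spikes.shape[0]
    n_in, n_out = W.shape
    I = np.zeros((T, n_out))
    for i in range(n_in):
        for j in range(n_out):
            if mask[i, j] == 0.0:
                continue                     # pruned connection carries no current
            d = int(delays[i, j])
            # a spike emitted at time t arrives at time t + d
            I[d:, j] += W[i, j] * spikes[: T - d, i]
    return I

I = delayed_input_current(spikes, W, delays, mask)
print(I.shape)  # (100, 4)
```

Pruning a connection (zeroing its mask entry) removes both its weight and its delay from the sum, which is why rewiring with DEEP R/RigL can act as a coarse form of delay selection: reconnected synapses sample new delay values.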