Spiking neural networks (SNNs) represent a promising approach in machine learning, combining the hierarchical learning capabilities of deep neural networks with the energy efficiency of spike-based computation. Traditional end-to-end training of SNNs is typically based on back-propagation, where weight updates are derived from gradients computed via the chain rule. However, back-propagation has limited biological plausibility and maps inefficiently onto neuromorphic hardware. In this study, we introduce an alternative training approach for SNNs. Instead of using back-propagation, we leverage weight perturbation methods within a forward-mode gradient framework. Specifically, we perturb the weight matrix with a small noise term and estimate gradients by observing the resulting changes in the network output. Experimental results on regression tasks, including solving various partial differential equations (PDEs), show that our approach achieves competitive accuracy, suggesting its suitability for neuromorphic systems and potential hardware compatibility.
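The weight-perturbation idea described above can be sketched on a toy problem. The snippet below is a minimal illustration, not the paper's SNN implementation: it uses plain linear regression instead of a spiking network, and the perturbation scale, learning rate, and data are all illustrative assumptions. The gradient estimate follows the forward-mode pattern: perturb the weights along a random direction, measure the change in the loss, and scale the direction by that directional derivative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (stand-in for the paper's PDE regression tasks).
X = rng.normal(size=(64, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true

def loss(w):
    """Mean-squared error of the linear model with weights w."""
    return np.mean((X @ w - y) ** 2)

w = np.zeros(3)   # initial weights
eps = 1e-4        # perturbation scale (illustrative)
lr = 0.1          # learning rate (illustrative)

for _ in range(500):
    v = rng.normal(size=w.shape)                 # random perturbation direction
    # Directional derivative estimated from the change in the perturbed loss.
    d = (loss(w + eps * v) - loss(w)) / eps
    g = d * v                                    # forward-gradient estimate
    w -= lr * g

print(np.round(w, 2))
```

Because `E[v vᵀ] = I` for standard-normal directions, `d * v` is an unbiased estimate of the true gradient (up to the finite-difference error of order `eps`), so the iterates converge to the least-squares solution without ever running a backward pass.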