The prevalence of the artificial intelligence of things (AIoT) calls for more energy-efficient edge computing paradigms, such as neuromorphic agents built on brain-inspired spiking neural network (SNN) models with spatiotemporally sparse binary activations. However, the lack of efficient, high-accuracy deep SNN learning algorithms prevents practical edge deployment under strictly bounded cost. In this paper, we propose a spatiotemporal orthogonal propagation (STOP) algorithm to tackle this challenge. Our algorithm enables fully synergistic learning of synaptic weights together with the firing thresholds and leakage factors of spiking neurons to improve SNN accuracy, all within a unified temporally-forward, trace-based framework that avoids the huge memory cost of storing the neural states of every time step in the forward pass. Characteristically, the spatially-backward neuronal errors and the temporally-forward traces propagate orthogonally to and independently of each other, substantially reducing computational overhead. Our STOP algorithm achieves high recognition accuracies of 99.53%, 94.84%, 74.92%, 98.26% and 77.10% on the MNIST, CIFAR-10, CIFAR-100, DVS-Gesture and DVS-CIFAR10 datasets, respectively, with moderate-scale SNNs ranging from LeNet-5 to ResNet-18. Compared with other deep SNN training methods, ours is better suited to edge intelligence scenarios where resources are limited but high-accuracy in-situ learning is desired.
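To make the trace-based idea concrete, the following is a minimal NumPy sketch of one leaky integrate-and-fire (LIF) layer whose leakage factors and firing thresholds are treated as learnable parameters, and whose eligibility traces are updated temporally forward so that no per-time-step state history has to be stored. All sizes, the trace-decay choice, and the error signal here are illustrative assumptions, not the paper's exact STOP formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes and time horizon (not specified in the abstract).
n_in, n_out, T = 4, 3, 10

W = rng.normal(0.0, 0.5, size=(n_out, n_in))  # synaptic weights (learnable)
theta = np.full(n_out, 1.0)                   # firing thresholds (learnable in STOP)
lam = np.full(n_out, 0.9)                     # leakage factors (learnable in STOP)

v = np.zeros(n_out)               # membrane potentials
trace = np.zeros((n_out, n_in))   # temporally-forward eligibility traces

for t in range(T):
    x = (rng.random(n_in) < 0.3).astype(float)  # sparse binary input spikes
    v = lam * v + W @ x                         # leaky integration of inputs
    s = (v >= theta).astype(float)              # binary spike output
    v = v * (1.0 - s)                           # hard reset after firing
    # Forward trace update: decays with the leakage factor and accumulates
    # the presynaptic inputs, so learning never needs the full state history.
    trace = lam[:, None] * trace + x[None, :]

# A spatially-backward neuronal error (here a random placeholder) is then
# combined with the forward traces to form the weight gradient.
err = rng.normal(size=n_out)
dW = err[:, None] * trace   # outer-product-style credit assignment
```

The key design point this sketch illustrates is the orthogonality: the trace runs forward in time alongside inference, while the error runs backward in space across layers, so neither pass has to unroll the other's dimension.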