Event-based machine learning promises more energy-efficient AI on future neuromorphic hardware. Here, we investigate how the recently discovered Eventprop algorithm for gradient descent on exact gradients in spiking neural networks can be scaled up to challenging keyword recognition benchmarks. We implemented Eventprop in the GPU enhanced Neuronal Networks (GeNN) framework and used it to train recurrent spiking neural networks on the Spiking Heidelberg Digits and Spiking Speech Commands datasets. We found that learning depended strongly on the loss function, and we extended Eventprop to a wider class of loss functions to enable effective training. When combined with the right additional mechanisms from the machine learning toolbox, Eventprop networks achieved state-of-the-art performance on Spiking Heidelberg Digits and good accuracy on Spiking Speech Commands. This work is a significant step towards a low-power neuromorphic alternative to current machine learning paradigms.