In a spiking neural network, is it enough for each neuron to spike at most once? In recent work, approximation bounds for spiking neural networks have been derived, quantifying how well they can fit target functions. However, these results are only valid for neurons that spike at most once, which is commonly thought to be a strong limitation. Here, we show that the opposite is true for a large class of spiking neuron models, including the commonly used leaky integrate-and-fire model with subtractive reset: for every approximation bound that is valid for a set of multi-spike neural networks, there is an equivalent set of single-spike neural networks, with only linearly more neurons (in the maximum number of spikes), for which the bound holds. The reverse direction holds as well, showing that, with respect to their approximation capabilities in general machine learning tasks, single-spike and multi-spike neural networks are equivalent. Consequently, many approximation results in the literature for single-spike neural networks also hold for the multi-spike case.
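To make the two neuron classes discussed above concrete, the following is a minimal simulation sketch of a leaky integrate-and-fire neuron with subtractive reset (the multi-spike model named in the abstract) next to a single-spike variant that goes silent after its first spike. The specific parameter values (`theta`, `leak`, the constant input) are illustrative choices, not taken from the paper, and this is only a toy illustration of the two spiking regimes, not the paper's equivalence construction.

```python
def lif_subtractive(inputs, theta=1.0, leak=0.9):
    """Leaky integrate-and-fire with subtractive reset: the membrane
    potential decays by `leak` per step, integrates the input, and on
    crossing `theta` it emits a spike and theta is subtracted (rather
    than resetting to zero), so the neuron can spike many times."""
    v, spikes = 0.0, []
    for t, i in enumerate(inputs):
        v = leak * v + i
        if v >= theta:
            spikes.append(t)
            v -= theta  # subtractive reset
    return spikes

def lif_single_spike(inputs, theta=1.0, leak=0.9):
    """Same dynamics, but the neuron spikes at most once and then stays silent."""
    v = 0.0
    for t, i in enumerate(inputs):
        v = leak * v + i
        if v >= theta:
            return [t]
    return []

# Under constant drive the multi-spike neuron fires repeatedly, while the
# single-spike variant reproduces only the first spike time.
drive = [0.4] * 10
print(lif_subtractive(drive))    # several spike times
print(lif_single_spike(drive))   # the first of those spike times only
```

By construction, the single-spike neuron's output always coincides with the first spike of its multi-spike counterpart; the paper's result concerns how a whole multi-spike network's behaviour can be matched by linearly many such single-spike neurons.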