Brains have evolved a diverse set of neurons with varying morphologies, physiological properties, and rich dynamics that impact their processing of temporal information. By contrast, most neural network models include a homogeneous set of units that vary only in their spatial parameters (weights and biases). To investigate the importance of temporal parameters to neural function, we trained spiking neural networks on tasks of varying temporal complexity, with different subsets of parameters held constant. We find that in a tightly resource-constrained setting, adapting conduction delays is essential to solve all test conditions, and indeed that it is possible to solve these tasks using only temporal parameters (delays and time constants) with weights held constant. In the most complex spatio-temporal task we studied, we found that an adaptable bursting parameter was essential. More generally, allowing for adaptation of both temporal and spatial parameters increases network robustness to noise, an important feature for both biological brains and neuromorphic computing systems. In summary, our findings highlight how rich and adaptable dynamics are key to solving temporally structured tasks at a low neural resource cost, which may be part of the reason why biological neurons vary so dramatically in their physiological properties.
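To make the distinction between temporal and spatial parameters concrete, the following is a minimal sketch (not the paper's actual architecture or training scheme) of a leaky integrate-and-fire neuron in which each synapse carries both a weight (spatial) and a conduction delay (temporal), alongside a membrane time constant. All names and parameter values here are illustrative assumptions. The sketch shows how changing delays alone, with weights fixed, can determine whether delayed inputs coincide closely enough to drive the neuron past threshold.

```python
import numpy as np

def simulate_lif(spike_times, delays, weights,
                 tau_m=20.0, v_th=1.0, dt=1.0, t_max=100.0):
    """Simulate one leaky integrate-and-fire neuron.

    spike_times : list of lists, input spike times (ms), one list per synapse
    delays      : per-synapse conduction delay (ms) -- a temporal parameter
    weights     : per-synapse weight               -- a spatial parameter
    tau_m       : membrane time constant (ms)      -- a temporal parameter
    Returns the output spike times (ms).
    """
    n_steps = int(t_max / dt)
    # Accumulate delayed, weighted input current at each time step.
    input_current = np.zeros(n_steps)
    for times, d, w in zip(spike_times, delays, weights):
        for t in times:
            idx = int((t + d) / dt)
            if idx < n_steps:
                input_current[idx] += w
    v = 0.0
    out_spikes = []
    decay = np.exp(-dt / tau_m)  # exponential membrane leak per step
    for step in range(n_steps):
        v = v * decay + input_current[step]
        if v >= v_th:
            out_spikes.append(step * dt)
            v = 0.0  # reset after spike
    return out_spikes

# Two synapses fire at the same source time with identical weights.
# Equal delays -> the inputs coincide and the neuron fires;
# unequal delays -> the first input leaks away before the second arrives.
coincident = simulate_lif([[10.0], [10.0]], delays=[0.0, 0.0],
                          weights=[0.6, 0.6])
spread = simulate_lif([[10.0], [10.0]], delays=[0.0, 30.0],
                      weights=[0.6, 0.6])
print(coincident)  # one output spike
print(spread)      # no output spikes
```

Holding the weights at 0.6 and varying only the delays flips the neuron between firing and silence, which is the sense in which temporal parameters alone can carry task-relevant computation in this toy setting.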