Spiking Neural Networks (SNNs) draw inspiration from biological neurons to enable brain-like computation, and have proven effective at processing temporal information with high energy efficiency and biological realism. Most existing SNNs are built on neural dynamics such as the (leaky) integrate-and-fire (IF/LIF) models, which are described by first-order ordinary differential equations (ODEs) with Markovian characteristics: the membrane potential at any time depends solely on its immediately preceding value, which can limit network expressiveness. Empirical studies of real neurons, however, reveal long-range correlations and fractal dendritic structures, suggesting non-Markovian behavior that is better modeled by fractional-order ODEs. Motivated by this, we propose a fractional-order spiking neural network (f-SNN) framework that strictly generalizes integer-order SNNs and captures long-term dependencies in membrane potentials and spike trains via fractional dynamics, enabling richer temporal patterns. We also release an open-source toolbox supporting the f-SNN framework, applicable to diverse architectures and real-world tasks. Experimentally, fractional adaptations of established SNNs within the f-SNN framework achieve superior accuracy, comparable energy efficiency, and improved robustness to noise, underscoring the promise of f-SNNs as an effective extension of traditional SNNs.
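To make the contrast between Markovian and fractional dynamics concrete, the sketch below simulates a single fractional LIF neuron using a Grünwald–Letnikov discretization of a fractional-order membrane equation. This is a minimal illustration under our own assumptions, not the released toolbox: the function names (gl_coeffs, fractional_lif), the parameters (alpha, dt, tau, v_rest, v_th, v_reset), the hard spike-and-reset rule, and the treatment of pre-reset history are all choices made here for exposition.

```python
import numpy as np

def gl_coeffs(alpha, n):
    """Grünwald–Letnikov coefficients c_k = (-1)^k * binom(alpha, k),
    via the stable recurrence c_0 = 1, c_k = c_{k-1} * (1 - (alpha + 1) / k)."""
    c = np.empty(n + 1)
    c[0] = 1.0
    for k in range(1, n + 1):
        c[k] = c[k - 1] * (1.0 - (alpha + 1.0) / k)
    return c

def fractional_lif(I, alpha=0.8, dt=0.1, tau=10.0,
                   v_rest=0.0, v_th=1.0, v_reset=0.0):
    """Simulate one fractional-order LIF neuron driven by input current I.

    Explicit GL scheme for D^alpha V = (-(V - v_rest) + I) / tau
    (unit changes of tau under fractional order are ignored here):
        V_n = dt**alpha * f(V_{n-1}, I_{n-1}) - sum_{k=1}^{n} c_k * V_{n-k}
    With alpha = 1 this collapses to the ordinary (Markovian) Euler LIF update;
    for alpha < 1 every past potential contributes with a power-law weight.
    """
    n_steps = len(I)
    c = gl_coeffs(alpha, n_steps)
    V = np.full(n_steps + 1, v_rest, dtype=float)
    spikes = np.zeros(n_steps, dtype=bool)
    for n in range(1, n_steps + 1):
        drift = (-(V[n - 1] - v_rest) + I[n - 1]) / tau
        # History term sum_{k=1}^{n} c_k V_{n-k}; this is the non-Markovian
        # memory. Pre-reset values are kept in the history for simplicity.
        memory = np.dot(c[1:n + 1], V[n - 1::-1])
        V[n] = dt**alpha * drift - memory
        if V[n] >= v_th:
            spikes[n - 1] = True
            V[n] = v_reset
    return V[1:], spikes
```

With alpha = 1 the history weights vanish beyond the previous step (c_1 = -1, c_k = 0 for k >= 2), recovering the standard first-order LIF rule; for alpha < 1 the power-law-weighted history is precisely the long-term dependency that motivates the f-SNN framework.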