Spiking Neural Networks (SNNs) draw inspiration from biological neurons to enable brain-like computation, and have proven effective at processing temporal information with high energy efficiency and biological realism. Most existing SNNs are built on neural dynamics such as the (leaky) integrate-and-fire (IF/LIF) models, which are described by first-order ordinary differential equations (ODEs) with Markovian characteristics: the membrane potential at any time depends solely on its immediate past value, potentially limiting network expressiveness. Empirical studies of real neurons, however, reveal long-range correlations and fractal dendritic structures, suggesting non-Markovian behavior that is better modeled by fractional-order ODEs. Motivated by this, we propose a fractional-order spiking neural network (f-SNN) framework that strictly generalizes integer-order SNNs and captures long-term dependencies in membrane potential and spike trains via fractional dynamics, enabling richer temporal patterns. We further release an open-source toolbox, spikeDE, that supports the f-SNN framework across diverse architectures and real-world tasks. Experimentally, fractional adaptations of established SNNs within the f-SNN framework achieve superior accuracy, comparable energy efficiency, and improved robustness to noise, underscoring the promise of f-SNNs as an effective extension of traditional SNNs.
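The Markovian vs. non-Markovian contrast can be made concrete with a minimal numerical sketch. Below is an illustrative fractional-order LIF neuron using a Grünwald–Letnikov discretization of the Caputo-style dynamics D^α V = (−(V − V_rest) + I)/τ: each update sums weighted contributions from the *entire* potential history, whereas the integer-order LIF (α = 1) depends only on the previous step. All function names, parameter values, and the spike/reset rule here are assumptions for illustration, not the paper's spikeDE implementation.

```python
import numpy as np

def gl_coeffs(alpha, n):
    """Grünwald–Letnikov weights c_j = (-1)^j * binom(alpha, j),
    via the recurrence c_0 = 1, c_j = c_{j-1} * (1 - (alpha + 1)/j)."""
    c = np.empty(n)
    c[0] = 1.0
    for j in range(1, n):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    return c

def fractional_lif(current, alpha=0.8, h=1.0, tau=10.0,
                   v_rest=0.0, v_th=1.0):
    """Illustrative fractional-order LIF neuron (not the spikeDE API).

    Discretization: sum_{j=0}^{n} c_j V_{n-j} = h^alpha * f(V_{n-1}, I_n),
    solved for V_n. For alpha = 1 the weights reduce to c_1 = -1 and
    c_j = 0 for j >= 2, recovering the Euler-stepped integer-order LIF
    (Markovian: only V_{n-1} matters)."""
    n_steps = len(current)
    c = gl_coeffs(alpha, n_steps + 1)
    v_hist = [v_rest]           # full potential history (the "memory")
    spikes = []
    for i_in in current:
        drift = (-(v_hist[-1] - v_rest) + i_in) / tau
        # non-Markovian memory term over ALL past potentials
        memory = sum(c[j] * v_hist[-j] for j in range(1, len(v_hist) + 1))
        v = h**alpha * drift - memory
        if v >= v_th:           # threshold crossing -> spike, hard reset
            spikes.append(1)    # (reset convention is an assumption here)
            v = v_rest
        else:
            spikes.append(0)
        v_hist.append(v)
    return np.array(spikes), np.array(v_hist[1:])
```

For α strictly between 0 and 1 the weights c_j decay slowly with j, so distant past potentials still influence the current update — the long-term dependence the abstract attributes to fractional dynamics — while setting α = 1 collapses the memory term to the single previous step.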