Spiking neural networks (SNNs) offer a biologically grounded and energy-efficient alternative to conventional neural architectures; however, they struggle with long-range temporal dependencies due to fixed synaptic and membrane time constants. This paper introduces ChronoPlastic Spiking Neural Networks (CPSNNs), a novel architecture that enables adaptive temporal credit assignment by dynamically modulating synaptic decay rates conditioned on network state. CPSNNs maintain multiple internal temporal traces and learn a continuous time-warping function that selectively preserves task-relevant information while rapidly forgetting noise. Unlike prior approaches based on adaptive membrane time constants, attention mechanisms, or external memory, CPSNNs embed temporal control directly within local synaptic dynamics, preserving linear-time complexity and neuromorphic compatibility. We provide a formal description of the model, analyze its computational properties, and demonstrate empirically that CPSNNs learn long-gap temporal dependencies significantly faster and more reliably than standard SNN baselines. Our results suggest that adaptive temporal modulation is a key missing ingredient for scalable temporal learning in spiking systems.
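To make the core mechanism concrete (state-conditioned modulation of synaptic decay within an otherwise standard leaky integrate-and-fire update), the following minimal Python sketch shows one plausible realization. It is not the paper's formal model; the class and parameter names (StateModulatedLIFLayer, W_in, W_mod, tau_min, tau_max, v_thresh) are illustrative assumptions introduced here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class StateModulatedLIFLayer:
    """Sketch of a LIF-style layer whose synaptic decay rates are
    modulated, per neuron and per time step, by the layer's own state.
    Illustrative only; not the formulation used in the paper."""

    def __init__(self, n_in, n_out, tau_min=2.0, tau_max=50.0, v_thresh=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_out, n_in))    # feedforward weights
        self.W_mod = rng.normal(0.0, 1.0 / np.sqrt(n_out), (n_out, n_out))  # state -> decay modulation
        self.tau_min, self.tau_max = tau_min, tau_max
        self.v_thresh = v_thresh
        self.v = np.zeros(n_out)      # membrane potential
        self.trace = np.zeros(n_out)  # synaptic current / temporal trace

    def step(self, spikes_in):
        # State-conditioned time constant: current activity chooses, per neuron,
        # how quickly its trace decays on this step (between tau_min and tau_max).
        tau = self.tau_min + (self.tau_max - self.tau_min) * sigmoid(self.W_mod @ self.v)
        decay = np.exp(-1.0 / tau)                      # per-neuron decay factor in (0, 1)
        self.trace = decay * self.trace + self.W_in @ spikes_in
        self.v = decay * self.v + self.trace
        spikes_out = (self.v >= self.v_thresh).astype(float)
        self.v = np.where(spikes_out > 0, 0.0, self.v)  # hard reset for neurons that fired
        return spikes_out
```

Under these assumptions, each update remains a local, elementwise modification of the usual LIF recurrence, so the per-step cost stays linear in the number of synapses, which is consistent with the linear-time claim above.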