Spike-timing-dependent plasticity (STDP) provides a biologically plausible learning mechanism for spiking neural networks (SNNs); however, Hebbian weight updates in recurrently connected architectures suffer from pathological weight dynamics: unbounded growth, catastrophic forgetting, and loss of representational diversity. We propose a neuromorphic regularization scheme inspired by the synaptic homeostasis hypothesis: periodic offline phases during which external inputs are suppressed, synaptic weights undergo stochastic decay toward a homeostatic baseline, and spontaneous activity enables memory consolidation. We demonstrate that this sleep-wake cycle prevents weight saturation while preserving learned structure. Empirically, we find that low to intermediate sleep durations (10--20\% of training time) improve stability on MNIST-like benchmarks in our STDP-SNN model, without any data-specific hyperparameter tuning. In contrast, the same sleep intervention yields no measurable benefit for the surrogate-gradient spiking neural network (SG-SNN). Taken together, these results suggest that periodic, sleep-based renormalization may be a fundamental mechanism for stabilizing local Hebbian learning in neuromorphic systems, while also indicating that care is required when integrating such protocols with gradient-based optimization methods.
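A minimal formalization of the sleep-phase update described above, assuming discrete-time exponential decay toward a homeostatic baseline with additive Gaussian noise; the decay rate $\lambda$, baseline $w_{0}$, and noise scale $\sigma$ are illustrative symbols and not quantities specified in the text:
\[
w_{ij}(t+1) \;=\; w_{ij}(t) + \lambda\bigl(w_{0} - w_{ij}(t)\bigr) + \sigma\,\xi_{ij}(t),
\qquad \xi_{ij}(t) \sim \mathcal{N}(0,1),
\]
applied only during offline (sleep) phases, with external inputs clamped to zero so that spontaneous recurrent activity alone drives spiking.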