By exploiting discrete signal processing and simulating the communication of biological neurons, Spiking Neural Networks (SNNs) offer a low-energy alternative to Artificial Neural Networks (ANNs). However, existing SNN models still face high computational costs due to their numerous time steps as well as network depth and scale. The tens of billions of neurons and trillions of synapses in the human brain develop from only about 20,000 genes, which inspires us to design an efficient genetic encoding strategy that evolves dynamically to regulate large-scale deep SNNs at low cost. We therefore first propose a genetically scaled SNN encoding scheme that incorporates globally shared genetic interactions to indirectly optimize the neuronal encoding rather than individual weights, which markedly reduces both parameter count and energy consumption. We then design a spatio-temporal evolutionary framework to optimize the inherent initial wiring rules. Two dynamic regularization operators in the fitness function respectively evolve the neuronal encoding toward a suitable distribution and enhance the information quality of the genetic interactions, substantially accelerating evolution and improving efficiency. Experiments show that our approach compresses parameters by approximately 50\% to 80\% while outperforming models with the same architectures by 0.21\% to 4.38\% on CIFAR-10, CIFAR-100, and ImageNet. In summary, the consistent gains of the proposed genetically encoded spatio-temporal evolution across datasets and architectures highlight its efficiency, broad scalability, and robustness, demonstrating the advantages of brain-inspired evolutionary genetic coding for SNN optimization.