Uncertainty in biological neural systems appears to be computationally beneficial rather than detrimental. In neuromorphic computing systems, however, device variability often limits performance, including accuracy and efficiency. In this work, we propose a spiking Bayesian neural network (SBNN) framework that unifies dynamic models of intrinsic device stochasticity (based on Magnetic Tunnel Junctions) with stochastic-threshold neurons, turning noise into a functional Bayesian resource. Experiments demonstrate that the SBNN achieves high accuracy (99.16% on MNIST, 94.84% on CIFAR-10) at 8-bit precision, while a rate-estimation method provides a ~20-fold training speedup. Furthermore, the SBNN exhibits superior robustness, with a 67% accuracy improvement under synaptic weight noise and 12% under input noise compared to standard spiking neural networks. Crucially, hardware validation confirms that the physical device implementation incurs negligible accuracy and calibration loss relative to the algorithmic model. Converting device stochasticity into neuronal uncertainty thus offers a route to compact, energy-efficient neuromorphic computing under uncertainty.
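To make the central idea concrete, the following is a minimal toy sketch of a leaky integrate-and-fire neuron whose firing threshold is jittered by Gaussian noise each step, standing in for MTJ-style device stochasticity, together with a trial-averaged rate estimate. The function name, parameters, and noise model here are illustrative assumptions, not the paper's actual architecture or training method.

```python
import numpy as np

def stochastic_threshold_neuron(inputs, v_th=1.0, tau=0.9, noise_std=0.2, rng=None):
    """Leaky integrate-and-fire neuron with a Gaussian-perturbed threshold,
    a toy stand-in for intrinsic device (MTJ-like) stochasticity.
    Returns a binary spike train the same length as `inputs`."""
    rng = rng or np.random.default_rng()
    v = 0.0
    spikes = []
    for x in inputs:
        v = tau * v + x                       # leaky integration of input
        th = v_th + noise_std * rng.normal()  # stochastic (noisy) threshold
        if v >= th:
            spikes.append(1)
            v = 0.0                           # reset membrane on spike
        else:
            spikes.append(0)
    return np.array(spikes)

# Rate estimation: averaging spike counts over repeated stochastic trials
# converts threshold noise into a smooth firing probability, the kind of
# quantity a Bayesian readout can operate on.
x = np.full(200, 0.4)                         # constant input drive
rate = np.mean([stochastic_threshold_neuron(x).mean() for _ in range(50)])
```

In this reading, each trial is one sample from the noise-induced distribution over spike trains, and the trial-averaged rate plays the role of the expectation that rate-based estimation replaces, which is where a training speedup of the kind reported would come from.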