Hebbian learning is a key principle underlying learning in biological neural networks. We relate a Hebbian spike-timing-dependent plasticity rule to noisy gradient descent with respect to a non-convex loss function on the probability simplex. Despite the constant injection of noise and the non-convexity of the underlying optimization problem, one can rigorously prove that the considered Hebbian learning dynamic identifies the presynaptic neuron with the highest activity and that the convergence is exponentially fast in the number of iterations. This is non-standard and surprising, as noisy gradient descent with a fixed noise level typically converges only to a stationary regime in which the noise causes the dynamic to fluctuate around a minimiser.
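To make the setting concrete, here is a minimal toy sketch of noisy descent on the probability simplex. It is illustrative only and not the paper's exact STDP dynamic: the linear surrogate loss $-\langle w, a\rangle$, the exponentiated-gradient (multiplicative) update, and the parameters `eta`, `sigma`, and `steps` are all assumptions chosen for the illustration.

```python
import numpy as np

def noisy_simplex_descent(activities, eta=0.1, sigma=0.05, steps=500, seed=0):
    """Toy noisy exponentiated-gradient descent on the probability simplex.

    Illustrative stand-in, not the paper's STDP rule: the surrogate loss
    -<w, a> and all parameter values are assumptions for this sketch.
    """
    rng = np.random.default_rng(seed)
    a = np.asarray(activities, dtype=float)
    w = np.full(a.size, 1.0 / a.size)            # start at the simplex barycentre
    for _ in range(steps):
        grad = -a                                # gradient of the surrogate loss
        noise = sigma * rng.standard_normal(a.size)  # fixed-level injected noise
        w = w * np.exp(-eta * (grad + noise))    # multiplicative (mirror) step
        w /= w.sum()                             # renormalise onto the simplex
    return w

# The weight vector concentrates on the most active presynaptic neuron,
# and in this toy model the concentration is exponentially fast in the
# number of iterations, despite the noise never being switched off.
print(noisy_simplex_descent(activities=[0.2, 0.9, 0.4]))
```

In this sketch the weight of the most active coordinate grows multiplicatively relative to the others, so the iterate approaches the corresponding vertex of the simplex at an exponential rate, mirroring the qualitative behaviour described in the abstract.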