Biological neural networks self-organize through local synaptic modifications to produce stable computations. How modifications at the synaptic level give rise to such computations at the network level remains an open question. Pehlevan et al. [Neur. Comp. 27 (2015), 1461--1495] proposed a self-organizing neural network model with Hebbian and anti-Hebbian synaptic updates that implements an algorithm for principal subspace analysis; however, global stability of the nonlinear synaptic dynamics has not been established. Here, for the case in which the feedforward and recurrent weights evolve on the same timescale, we prove global stability of the continuum limit of the synaptic dynamics and show that the dynamics evolve in two phases. In the first phase, the synaptic weights converge to an invariant manifold on which the `neural filters' are orthonormal. In the second phase, the synaptic dynamics follow the gradient flow of a non-convex potential function whose minima correspond to neural filters that span the principal subspace of the input data.
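To make the network model concrete, the following is a minimal NumPy sketch of an online Hebbian/anti-Hebbian similarity-matching network in the spirit of Pehlevan et al. (2015), with the feedforward and lateral weights updated at the same (activity-dependent) step size. The synthetic input statistics, initialization, positive initial value of the cumulative activity, and the exact linear solve for the fixed point of the neural dynamics are illustrative assumptions, not the precise model analyzed here.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, T = 10, 3, 50_000            # input dim, output dim, number of samples

# Synthetic inputs whose covariance has a well-separated top-k principal subspace
# (illustrative choice of eigenvalues).
eigs = np.concatenate([[6.0, 5.0, 4.0], 0.1 * np.ones(n - k)])
X = rng.standard_normal((T, n)) * np.sqrt(eigs)

W = 0.1 * rng.standard_normal((k, n))   # feedforward (Hebbian) weights
M = np.zeros((k, k))                    # lateral (anti-Hebbian) weights, zero diagonal
Yhat = np.ones(k)                       # cumulative squared activities (init > 0 to avoid /0)

for x in X:
    # Neural dynamics run to their fixed point: y = W x - M y  =>  (I + M) y = W x.
    y = np.linalg.solve(np.eye(k) + M, W @ x)
    Yhat += y**2
    # Hebbian feedforward and anti-Hebbian lateral updates with the same
    # per-neuron step size 1/Yhat_i (equal timescales for W and M).
    W += (np.outer(y, x) - y[:, None]**2 * W) / Yhat[:, None]
    M += (np.outer(y, y) - y[:, None]**2 * M) / Yhat[:, None]
    np.fill_diagonal(M, 0.0)

# Neural filters F map inputs to steady-state outputs, y = F x. After learning,
# F F^T should be close to the identity (phase 1) and the rows of F should span
# the top-k principal subspace of the input covariance (phase 2).
F = np.linalg.solve(np.eye(k) + M, W)
print("||F F^T - I|| =", np.linalg.norm(F @ F.T - np.eye(k)))
```

In this sketch the recurrent relaxation of the output is replaced by an exact linear solve; an actual neural implementation would instead iterate the leaky recurrent dynamics of y to convergence before applying the weight updates.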