Neural networks storing multiple discrete attractors are canonical models of biological memory. Previously, the dynamical stability of such networks could only be guaranteed under highly restrictive conditions. Here, we derive a theory of the local stability of discrete fixed points in a broad class of networks with graded neural activities and in the presence of noise. By directly analyzing the bulk and the outliers of the Jacobian spectrum, we show that all fixed points are stable below a critical load that is distinct from the classical \textit{critical capacity} and depends on the statistics of neural activities in the fixed points as well as the single-neuron activation function. Our analysis highlights the computational benefits of threshold-linear activation and sparse-like patterns.
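The stability criterion described above can be illustrated numerically. The toy sketch below (not the paper's method; all sizes, the sparsity level, and the Hebbian-like connectivity are illustrative assumptions) builds a threshold-linear network storing a few sparse-like graded patterns, relaxes the rate dynamics toward a fixed point, and then reads off local stability from the real part of the Jacobian's rightmost eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5                        # neurons and stored patterns (illustrative sizes)
phi = lambda x: np.maximum(x, 0.0)   # threshold-linear (ReLU) activation

# Hebbian-like connectivity storing P sparse-like, graded activity patterns
# (a stand-in for the broad class of networks considered in the text)
xi = rng.random((P, N)) * (rng.random((P, N)) < 0.3)  # ~30% active, graded levels
xi_c = xi - xi.mean()
W = xi_c.T @ xi_c / N
np.fill_diagonal(W, 0.0)

# Relax rate dynamics dx/dt = -x + W @ phi(x) toward a fixed point.
# In this unoptimized toy the network may settle far from the seeded pattern
# (even to a near-trivial state); the Jacobian computation is the same regardless.
x = xi[0].copy()
for _ in range(2000):
    x += 0.05 * (-x + W @ phi(x))

# Jacobian at the fixed point: J_ij = -delta_ij + W_ij * phi'(x_j),
# with phi'(x_j) = 1 where x_j > 0 and 0 otherwise for threshold-linear units.
Jac = -np.eye(N) + W * (x > 0)[None, :]
eigs = np.linalg.eigvals(Jac)        # bulk + possible outliers of the spectrum
stable = eigs.real.max() < 0         # True means the fixed point is locally stable
```

Sweeping the load `P/N` upward in such a sketch is one way to probe numerically for a critical load beyond which outlier eigenvalues cross into the right half-plane.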