Kernel-based learning methods can dramatically increase the storage capacity of Hopfield networks, yet the dynamical mechanisms behind this enhancement remain poorly understood. We address this gap by combining a geometric characterization of the attractor landscape with the spectral theory of kernel machines. Using a novel metric, which we call Pinnacle Sharpness, we empirically uncover a rich phase diagram of attractor stability and identify a Ridge of Optimization on which the network achieves maximal robustness under high load. Phenomenologically, this ridge is characterized by a Force Antagonism, in which a strong driving force is counterbalanced by a collective feedback force. We interpret this behavior theoretically as the consequence of a specific reorganization of the weight spectrum, which we term Spectral Concentration. Our analysis shows that, rather than undergoing a simple rank-1 collapse, the network on the ridge self-organizes into a critical regime: the leading eigenvalue is amplified to enhance global stability (the Direct Force), while the trailing eigenvalues remain finite to sustain high memory capacity (the Indirect Force). Together, these results suggest a spectral mechanism by which learning reconciles stability and capacity in high-dimensional associative memory models.
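To make the Direct/Indirect Force picture concrete, the following schematic decomposition writes the local field h_i acting on neuron i in the eigenbasis of a symmetric weight matrix W; the notation and the identification of each term with a force are our own illustrative assumptions, not the paper's formal definitions.

\[
W = \sum_{k} \lambda_k \,\mathbf{v}_k \mathbf{v}_k^{\top},
\qquad
h_i = \sum_j W_{ij}\, s_j
    = \underbrace{\lambda_1\, (\mathbf{v}_1^{\top}\mathbf{s})\, v_{1,i}}_{\text{Direct Force (leading mode)}}
    \;+\; \underbrace{\sum_{k \ge 2} \lambda_k\, (\mathbf{v}_k^{\top}\mathbf{s})\, v_{k,i}}_{\text{Indirect Force (trailing bulk)}}.
\]

In this notation, Spectral Concentration amplifies \(\lambda_1\), strengthening the leading term and hence global stability, while keeping the trailing \(\lambda_k\) finite rather than collapsing them to zero, so the bulk term retains enough rank to encode many distinct patterns.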
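For a hands-on illustration, here is a minimal NumPy sketch of a high-load associative memory trained with a pseudoinverse (kernel-regression-style) rule, together with a crude leading-versus-trailing eigenvalue readout. The learning rule, the recall procedure, and this spectral diagnostic are assumptions chosen for illustration; they are not the paper's training scheme or the Pinnacle Sharpness metric.

import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 60                             # neurons, stored patterns (load P/N = 0.3)
xi = rng.choice([-1.0, 1.0], size=(P, N))  # random binary patterns

# Pseudoinverse (kernel-style) learning rule: W = xi^T (xi xi^T)^{-1} xi.
# It reduces to the Hebbian rule for orthogonal patterns and stores well
# beyond Hebbian capacity, so the high-load regime is actually reachable.
K = xi @ xi.T                              # P x P Gram (linear-kernel) matrix
W = xi.T @ np.linalg.solve(K, xi)          # N x N weight matrix
np.fill_diagonal(W, 0.0)                   # remove self-coupling

def recall(s, steps=10):
    """Asynchronous sign-threshold dynamics: s_i <- sign(sum_j W_ij s_j)."""
    s = s.copy()
    for _ in range(steps):
        for i in rng.permutation(len(s)):
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Probe attractor stability: flip 10% of one stored pattern, then let it relax.
probe = xi[0].copy()
probe[rng.choice(N, size=N // 10, replace=False)] *= -1
print("overlap after recall:", recall(probe) @ xi[0] / N)  # near 1.0 => recovered

# Crude spectral readout: leading eigenvalue vs. the trailing pattern modes.
eigs = np.linalg.eigvalsh((W + W.T) / 2)[::-1]             # descending order
print("leading eigenvalue:", eigs[0])
print("mean of trailing pattern modes:", eigs[1:P].mean())

The pseudoinverse rule is used here only as a convenient kernel-style stand-in: it keeps the stored patterns stable well past the Hebbian capacity limit, so the eigenvalue readout reflects the structure of the learned spectrum rather than retrieval failure.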