Hebbian learning limits the storage capacity of Hopfield networks to a pattern-to-neuron ratio of about 0.14. We propose learning Hopfield networks with Kernel Logistic Regression (KLR). Unlike linear methods, KLR uses a kernel to implicitly map patterns into a high-dimensional feature space, enhancing their separability. By learning dual variables, KLR dramatically improves storage capacity, achieving perfect recall even when the number of patterns exceeds the number of neurons (ratios up to 1.5 are demonstrated), and it also improves noise robustness. KLR demonstrably outperforms both Hebbian learning and linear logistic regression.
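The scheme described above can be sketched as follows: one kernel logistic regression per neuron predicts that neuron's bit from the full pattern, with dual variables trained by gradient descent on the logistic loss, and recall iterating the learned kernel readout. This is a minimal illustration, not the paper's implementation; the RBF kernel choice, network sizes, learning rate, and regularization strength are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 32, 16                              # neurons, stored patterns (illustrative sizes)
X = rng.choice([-1.0, 1.0], size=(P, N))   # bipolar patterns to store

def rbf_kernel(A, B, gamma):
    # squared Euclidean distances between rows of A and rows of B
    d2 = (A * A).sum(1)[:, None] + (B * B).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

gamma = 1.0 / N
K = rbf_kernel(X, X, gamma)                # (P, P) Gram matrix over stored patterns

# One kernel logistic regression per neuron i, parameterized by dual variables:
#   h_i(x) = sum_mu alpha[mu, i] * K(x, xi^mu),  predicted bit = sign(h_i(x))
alpha = np.zeros((P, N))
lr, lam = 0.5, 1e-4                        # step size and ridge penalty (assumed values)
for _ in range(2000):
    H = K @ alpha                          # margins at the stored patterns, shape (P, N)
    # gradient of mean logistic loss sum log(1 + exp(-X * H)) w.r.t. alpha
    grad = -K @ (X * sigmoid(-X * H)) / P + lam * alpha
    alpha -= lr * grad

def recall(s, steps=5):
    # synchronous recall: repeatedly pass the state through the learned kernel readout
    for _ in range(steps):
        k = rbf_kernel(s[None, :], X, gamma)   # similarity of current state to patterns
        s = np.sign(k @ alpha)[0]
    return s
```

After training, every stored pattern is a fixed point of `recall`, and a state with a few flipped bits is driven back to the nearest stored pattern via its kernel similarity to the training set.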