By leveraging tools from the statistical mechanics of complex systems, in these short notes we extend the architecture of a neural network for hetero-associative memory (the three-directional associative memory, TAM) to explore supervised and unsupervised learning protocols. In particular, by providing entropically heterogeneous datasets to its various layers, we predict and quantify a new emergent phenomenon -- which we term {\em layers' cooperativeness} -- whereby the interplay of dataset entropies across the network's layers enhances their retrieval capabilities beyond those they would have without reciprocal influence. Naively, one would expect layers trained on less informative datasets to develop smaller retrieval regions than layers exposed to more informative ones: this does not happen, and all retrieval regions settle to the same amplitude, allowing for globally optimal retrieval performance. This cooperative behavior marks a significant advance in the understanding of emergent computational capabilities within disordered systems.
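To fix ideas, here is a minimal sketch of the kind of architecture at stake, with purely illustrative notation and assuming a standard BAM-like Hebbian structure (it is not meant to reproduce the model definition): three layers of binary neurons $\sigma$, $\tau$, $\phi$, taken of equal size $N$ for simplicity, interact pairwise through couplings built from $K$ stored triplets of patterns $(\xi^{\mu},\eta^{\mu},\chi^{\mu})$,
\begin{equation*}
H(\sigma,\tau,\phi) \;=\; -\frac{1}{N}\sum_{\mu=1}^{K}\Big(\sum_{i,j}\xi_i^{\mu}\eta_j^{\mu}\,\sigma_i\tau_j \;+\; \sum_{j,k}\eta_j^{\mu}\chi_k^{\mu}\,\tau_j\phi_k \;+\; \sum_{k,i}\chi_k^{\mu}\xi_i^{\mu}\,\phi_k\sigma_i\Big),
\end{equation*}
where, in the supervised and unsupervised protocols, each archetype is replaced by an empirical estimate built from noisy examples whose quality, hence the entropy of the corresponding dataset, may differ from layer to layer.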