Dense associative memories (DAMs) store and retrieve patterns as fixed points of an energy function, but existing models are limited to vector representations. We extend DAMs to Gaussian densities equipped with the 2-Wasserstein distance. Our framework defines a log-sum-exp energy over the stored distributions and a retrieval dynamics that aggregates optimal transport maps with Gibbs weights. Stationary points correspond to self-consistent Wasserstein barycenters, generalizing classical DAM fixed points. We prove exponential storage capacity and provide quantitative retrieval guarantees under Wasserstein perturbations. We validate the method on synthetic data and on real-world image (CelebA, CIFAR-10) and text (text8, NLI) datasets. By generalizing from vectors to distributions, our work bridges classical DAMs with modern generative modeling and paves the way for distributional storage and retrieval in memory-augmented learning.
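The retrieval dynamics described above can be sketched in a minimal one-dimensional setting, where the 2-Wasserstein distance between Gaussians and their barycenter have closed forms. This is an illustrative assumption, not the paper's implementation: the function names, the inverse temperature `beta`, and the choice of 1-D Gaussians (for which the weighted barycenter reduces to weighted averages of means and standard deviations) are all hypothetical.

```python
import numpy as np

def w2_sq(m1, s1, m2, s2):
    # Squared 2-Wasserstein distance between 1-D Gaussians
    # N(m1, s1^2) and N(m2, s2^2): (m1 - m2)^2 + (s1 - s2)^2.
    return (m1 - m2) ** 2 + (s1 - s2) ** 2

def retrieve(query, patterns, beta=5.0, iters=50):
    # Fixed-point iteration: recompute Gibbs weights from the
    # Wasserstein distances, then move to the weighted barycenter.
    m, s = query
    for _ in range(iters):
        d = np.array([w2_sq(m, s, mi, si) for mi, si in patterns])
        w = np.exp(-beta * (d - d.min()))  # shifted for numerical stability
        w /= w.sum()
        # Barycenter of 1-D Gaussians under W2: weighted mean and std.
        m = sum(wi * mi for wi, (mi, si) in zip(w, patterns))
        s = sum(wi * si for wi, (mi, si) in zip(w, patterns))
    return m, s

# Stored distributions and a query near the second pattern.
patterns = [(0.0, 1.0), (5.0, 0.5), (-4.0, 2.0)]
m, s = retrieve((4.2, 0.7), patterns)
```

Under a large enough `beta`, the iteration snaps to the nearest stored Gaussian, mirroring the sharp retrieval behind the exponential-capacity claim; smaller `beta` yields a blended barycenter of several stored patterns.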