Recurrent neural networks (RNNs) provide a theoretical framework for understanding computation in biological neural circuits, yet classical results, such as Hopfield's model of associative memory, rely on symmetric connectivity that restricts network dynamics to gradient-like flows. In contrast, biological networks support rich time-dependent behaviour enabled by their asymmetric connectivity. Here we introduce a general framework, which we term drift-diffusion matching, for training continuous-time RNNs to represent arbitrary stochastic dynamical systems within a low-dimensional latent subspace. By allowing asymmetric connectivity, we show that RNNs can faithfully embed the drift and diffusion of a given stochastic differential equation, including nonlinear and nonequilibrium dynamics such as chaotic attractors. As an application, we construct RNN realisations of stochastic systems that transiently explore multiple attractors through both input-driven switching and autonomous transitions driven by nonequilibrium currents, which we interpret as models of associative and sequential (episodic) memory. To elucidate how these dynamics are encoded in the network, we introduce decompositions of the RNN based on its asymmetric connectivity and its time-irreversibility. Our results extend attractor neural network theory beyond equilibrium, showing that asymmetric neural populations can implement a broad class of dynamical computations within low-dimensional manifolds, unifying ideas from associative memory, nonequilibrium statistical mechanics, and neural computation.
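The abstract does not spell out how drift-diffusion matching is implemented. The sketch below is a minimal illustration of the idea under stated assumptions, not the paper's actual training procedure: a fixed linear readout z = P r, a pseudoinverse embedding r = E z, a tanh RNN dr/dt = -r + phi(W r) + noise, and plain gradient descent on the drift error at sampled latent points. All names here (P, E, target_drift, the planar limit-cycle target) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: N recurrent units, D latent dimensions.
N, D = 200, 2
P = rng.standard_normal((D, N)) / np.sqrt(N)  # readout: z = P @ r
E = np.linalg.pinv(P)                         # embedding: r = E @ z, so P @ E = I_D

def phi(x):
    return np.tanh(x)

def target_drift(z):
    # Stand-in target SDE, dz = f(z) dt + sigma dW: a planar limit cycle,
    # i.e. a simple nonequilibrium system with a rotational probability current.
    x, y = z
    r2 = x**2 + y**2
    return np.array([x - y - x * r2, x + y - y * r2])

sigma = 0.1  # target diffusion amplitude (isotropic, for simplicity)

# Drift matching: stochastic gradient descent on 0.5 * ||projected RNN drift - f(z)||^2,
# where the RNN drift, evaluated on the embedded manifold r = E @ z, projects to
# P @ (-r + phi(W @ r)). W is unconstrained, so the learned connectivity is
# generically asymmetric.
W = rng.standard_normal((N, N)) / np.sqrt(N)
lr = 1e-2
for step in range(5000):
    z = rng.uniform(-1.5, 1.5, size=D)  # sample a latent training point
    r = E @ z
    u = W @ r
    err = P @ (-r + phi(u)) - target_drift(z)
    # d err_d / d W_ij = P_di * phi'(u_i) * r_j, with phi'(u) = 1 - tanh(u)^2:
    grad = np.outer((P.T @ err) * (1.0 - np.tanh(u) ** 2), r)
    W -= lr * grad

# Diffusion matching: injecting neural noise B dW_r with B = sigma * E gives a
# projected covariance P B B^T P^T = sigma^2 I exactly, since P @ E = I_D.
B = sigma * E

# Sanity check: Euler-Maruyama simulation of the trained network, read out in
# latent space. If the fit generalises off the training manifold, z should
# wander near the unit circle of the target limit cycle.
dt, T = 1e-2, 20000
r = E @ np.array([1.0, 0.0])
for t in range(T):
    r = r + dt * (-r + phi(W @ r)) + np.sqrt(dt) * (B @ rng.standard_normal(D))
z = P @ r
```

The pseudoinverse embedding makes the diffusion constraint trivial in this toy setting; in general one would also fit the projected noise covariance to the target diffusion matrix, and the dynamics off the embedded manifold would need to be stabilised.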