Recent advances in learning dynamical systems from data have shown significant promise. However, many existing methods assume access to the full state of the system, an assumption that is rarely satisfied in practice: systems are typically monitored through a limited number of sensors, leading to partial observability. To address this challenge, we draw inspiration from the Mori-Zwanzig formalism, which provides a theoretical connection between hidden variables and memory terms. Motivated by this perspective, we introduce a constant-lag neural delay differential equation (NDDE) framework, a continuous-time approach for learning non-Markovian dynamics directly from data. Memory effects are captured through a finite set of time delays, which are identified via the adjoint method. We validate the proposed approach on a range of datasets spanning synthetic systems, chaotic dynamics, and experimental measurements, including the Kuramoto-Sivashinsky equation and cavity-flow experiments. Results demonstrate that NDDEs compare favourably with existing approaches for partially observed systems, including long short-term memory (LSTM) networks and augmented neural ordinary differential equations (ANODEs). Overall, NDDEs offer a principled and data-efficient framework for modelling non-Markovian dynamics under partial observability. An open-source implementation accompanies this article.
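To make the constant-lag idea concrete, the following is a minimal sketch (not the authors' implementation) of forward integration of a delay differential equation dx/dt = f(x(t), x(t − τ)), where `f` stands in for a trained network and `tau` for one learned delay. All names and the fixed-step Euler scheme are illustrative assumptions; a practical implementation would use a proper DDE solver and compute gradients with respect to the delays via the adjoint method.

```python
import numpy as np

def integrate_ndde(f, x0, tau, t_end, dt=0.01):
    """Euler-integrate the constant-lag DDE dx/dt = f(x(t), x(t - tau)).

    The pre-history for t < 0 is held constant at x0, and the delayed
    state x(t - tau) is read back from the buffer of past values.
    Illustrative sketch only: a real solver would interpolate the
    history and control the step size.
    """
    n_steps = int(round(t_end / dt))
    lag = max(1, int(round(tau / dt)))  # delay expressed in whole steps
    xs = [np.asarray(x0, dtype=float)]
    for k in range(n_steps):
        x_now = xs[-1]
        x_lag = xs[k - lag] if k >= lag else xs[0]  # constant pre-history
        xs.append(x_now + dt * f(x_now, x_lag))
    return np.array(xs)

# Toy "network": a linear map standing in for a trained f_theta.
W_now = np.array([[-0.5]])
W_lag = np.array([[-1.0]])
f = lambda x, x_lag: W_now @ x + W_lag @ x_lag

traj = integrate_ndde(f, x0=[1.0], tau=1.0, t_end=10.0)
```

Because the delayed state enters `f` explicitly, the model is non-Markovian in the observed variable alone, which is exactly the memory structure the Mori-Zwanzig perspective motivates.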