We study the problem of learning the law of linear stochastic partial differential equations (SPDEs) with additive Gaussian forcing from spatiotemporal observations. Most existing deep learning approaches either assume access to the driving noise or initial condition, or rely on deterministic surrogate models that fail to capture intrinsic stochasticity. We propose a structured latent-variable formulation that requires only observations of solution realizations and learns the underlying randomly forced dynamics. Our approach combines a spectral Galerkin projection with a truncated Wiener chaos expansion, yielding a principled separation between deterministic evolution and stochastic forcing. This reduces the infinite-dimensional SPDE to a finite system of parametrized ordinary differential equations governing latent temporal dynamics. The latent dynamics and stochastic forcing are jointly inferred through variational learning, allowing recovery of stochastic structure without explicit observation or simulation of noise during training. Empirical evaluation on synthetic data demonstrates state-of-the-art performance under comparable modeling assumptions across bounded and unbounded one-dimensional spatial domains.
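To make the reduction concrete, here is a minimal sketch (not the paper's implementation, and with assumed constants `nu`, `sigma`, and mode count `K`) of a spectral Galerkin truncation of a 1D stochastic heat equation with additive noise on a bounded domain: projecting onto sine modes turns the SPDE into a finite system of independent Ornstein–Uhlenbeck SDEs for the latent coefficients, which is then integrated by Euler–Maruyama.

```python
import numpy as np

# Sketch: Galerkin reduction of du = nu*u_xx dt + sigma dW(t,x) on [0, pi]
# with Dirichlet boundaries, using sine modes sin(k x). Each coefficient
# a_k then satisfies da_k = -nu * k^2 * a_k dt + sigma dW_k (independent
# scalar SDEs), i.e. the finite latent system the abstract refers to.

rng = np.random.default_rng(0)
nu, sigma = 0.1, 0.5        # assumed diffusion and noise amplitudes
K, T, n_steps = 8, 1.0, 1000  # assumed truncation level and time grid
dt = T / n_steps
ks = np.arange(1, K + 1)

a = np.zeros(K)             # latent Galerkin coefficients, zero init
for _ in range(n_steps):    # Euler-Maruyama on the truncated system
    dW = rng.normal(0.0, np.sqrt(dt), size=K)
    a = a - nu * ks**2 * a * dt + sigma * dW

# Reconstruct one solution realization on a spatial grid from the
# truncated expansion u(x) = sum_k a_k sin(k x)
x = np.linspace(0.0, np.pi, 64)
u = (a[:, None] * np.sin(ks[:, None] * x[None, :])).sum(axis=0)
print(u.shape)
```

In the learned model, the drift of this latent system is parametrized and inferred variationally rather than fixed as above; the sketch only illustrates how the Galerkin projection yields finite-dimensional temporal dynamics.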