Neural operators have recently grown in popularity as Partial Differential Equation (PDE) surrogate models. Learning solution functionals, rather than functions, has proven to be a powerful approach to computing fast, accurate solutions to complex PDEs. While much work has evaluated neural operator performance on a wide variety of surrogate modeling tasks, these studies typically evaluate performance on a single equation at a time. In this work, we develop a novel contrastive pretraining framework utilizing a Generalized Contrastive Loss that improves neural operator generalization across multiple governing equations simultaneously. Governing-equation coefficients are used to measure ground-truth similarity between systems. A combination of physics-informed system evolution and latent-space model output is anchored to the input data and used in our distance function. We find that physics-informed contrastive pretraining improves the accuracy of the Fourier Neural Operator on fixed-future and autoregressive rollout tasks for the 1D and 2D Heat, Burgers', and linear advection equations.
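The coefficient-based similarity and graded contrastive objective described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a standard Generalized Contrastive Loss of the form ψ·d² + (1−ψ)·max(0, m−d)², with a graded similarity ψ ∈ [0, 1] derived from the distance between governing-equation coefficient vectors; the function names and the exponential similarity kernel are illustrative choices.

```python
import numpy as np

def coefficient_similarity(coeffs_a, coeffs_b, scale=1.0):
    """Graded ground-truth similarity in [0, 1] between two systems,
    computed from their governing-equation coefficient vectors.
    Identical coefficients give psi = 1; distant coefficients give psi -> 0.
    (The exponential kernel is an illustrative choice.)"""
    dist = np.linalg.norm(np.asarray(coeffs_a) - np.asarray(coeffs_b))
    return float(np.exp(-dist / scale))

def generalized_contrastive_loss(z_a, z_b, psi, margin=1.0):
    """Generalized Contrastive Loss on a pair of latent embeddings:
    psi weights an attraction term (squared distance) against a
    margin-based repulsion term, so partially similar systems are
    pulled together only in proportion to their similarity."""
    d = float(np.linalg.norm(z_a - z_b))
    attraction = psi * d ** 2
    repulsion = (1.0 - psi) * max(0.0, margin - d) ** 2
    return attraction + repulsion
```

For a pair drawn from the same PDE (ψ = 1), the loss reduces to pure attraction and vanishes when the embeddings coincide; for a pair with very different coefficients (ψ ≈ 0), the loss only penalizes embeddings that fall inside the margin.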