Neural operators have recently grown in popularity as Partial Differential Equation (PDE) surrogate models. Learning solution functionals, rather than functions, has proven to be a powerful approach for computing fast, accurate solutions to complex PDEs. While much work has been done evaluating neural operator performance on a wide variety of surrogate modeling tasks, these works typically evaluate performance on a single equation at a time. In this work, we develop a novel contrastive pretraining framework utilizing Generalized Contrastive Loss that improves neural operator generalization across multiple governing equations simultaneously. Governing equation coefficients are used to measure ground-truth similarity between systems. A combination of physics-informed system evolution and latent-space model output is anchored to input data and used in our distance function. We find that physics-informed contrastive pretraining improves accuracy for the Fourier Neural Operator in fixed-future and autoregressive rollout tasks for the 1D and 2D Heat, Burgers', and linear advection equations.
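The two ingredients named above, a graded similarity derived from governing-equation coefficients and a Generalized Contrastive Loss that uses it, can be sketched minimally as follows. This is an illustrative assumption, not the paper's implementation: the exponential form of `coefficient_similarity`, the `scale` and `margin` parameters, and the function names are all hypothetical.

```python
import numpy as np

def coefficient_similarity(c1, c2, scale=1.0):
    """Graded ground-truth similarity in [0, 1] between two PDE systems,
    computed from the distance between their coefficient vectors.
    (Hypothetical form: the abstract only states that coefficients
    define similarity, not how.)"""
    c1, c2 = np.asarray(c1, dtype=float), np.asarray(c2, dtype=float)
    return float(np.exp(-scale * np.linalg.norm(c1 - c2)))

def generalized_contrastive_loss(z1, z2, psi, margin=1.0):
    """Generalized Contrastive Loss: a graded similarity psi in [0, 1]
    interpolates between an attracting term (pull similar embeddings
    together) and a repelling term (push dissimilar ones apart)."""
    d = float(np.linalg.norm(np.asarray(z1, dtype=float) - np.asarray(z2, dtype=float)))
    attract = 0.5 * psi * d**2
    repel = 0.5 * (1.0 - psi) * max(0.0, margin - d)**2
    return attract + repel
```

Identical systems (psi = 1) with identical embeddings incur zero loss, and fully dissimilar systems (psi = 0) incur zero loss once their embeddings are at least `margin` apart; intermediate psi values grade smoothly between the two regimes.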