Transformer models are increasingly used for solving Partial Differential Equations (PDEs). Several adaptations have been proposed, all of which suffer from the typical problems of Transformers, such as quadratic memory and time complexity. Furthermore, all prevalent architectures for PDE solving lack at least one of several desirable properties of an ideal surrogate model, such as (i) generalization to PDE parameters not seen during training, (ii) spatial and temporal zero-shot super-resolution, (iii) continuous temporal extrapolation, (iv) support for 1D, 2D, and 3D PDEs, and (v) efficient inference for longer temporal rollouts. To address these limitations, we propose Vectorized Conditional Neural Fields (VCNeFs), which represent the solution of time-dependent PDEs as neural fields. In contrast to prior methods, however, VCNeFs compute, for a set of multiple spatio-temporal query points, their solutions in parallel and model their dependencies through attention mechanisms. Moreover, VCNeFs can condition the neural field on both the initial conditions and the parameters of the PDEs. An extensive set of experiments demonstrates that VCNeFs are competitive with and often outperform existing ML-based surrogate models.
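To make the "vectorized" idea concrete, the following is a minimal, untrained NumPy sketch of one forward pass under assumed dimensions: a set of spatio-temporal query points (x, t) is embedded together with a conditioning vector (a stand-in for an encoded initial condition plus a PDE parameter), a self-attention layer models dependencies among the queries in parallel, and a linear head emits one solution value per point. All sizes, weight shapes, and the conditioning scheme here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d = 16    # hidden width (hypothetical)
n_q = 8   # number of spatio-temporal query points

# Hypothetical conditioning: encoded initial condition + one PDE parameter
u0_feat = rng.normal(size=d)      # stand-in for an encoded initial condition
pde_param = np.array([0.1])       # e.g., a viscosity-like coefficient
cond = np.concatenate([u0_feat, pde_param])

# Query points (x, t), all processed in parallel ("vectorized")
queries = rng.uniform(size=(n_q, 2))   # columns: spatial coord x, time t

# Embed each query jointly with the conditioning vector
W_in = rng.normal(size=(2 + cond.size, d)) / np.sqrt(2 + cond.size)
h = np.tanh(np.hstack([queries, np.tile(cond, (n_q, 1))]) @ W_in)  # (n_q, d)

# Self-attention over the query set models dependencies between points
Wq, Wk, Wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
Q, K, V = h @ Wq, h @ Wk, h @ Wv
attn = softmax(Q @ K.T / np.sqrt(d), axis=-1)   # (n_q, n_q)
h = h + attn @ V                                # residual update

# Linear head maps each hidden state to a scalar solution value u(x, t)
W_out = rng.normal(size=(d, 1)) / np.sqrt(d)
u = (h @ W_out).ravel()   # one solution value per query point, shape (n_q,)
```

Because every query point attends to every other, the solution at each (x, t) can depend on the whole query set, while the conditioning vector lets the same field adapt to different initial conditions and PDE parameters.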