This paper introduces PDEformer-1, a versatile neural solver capable of simultaneously addressing a wide range of partial differential equations (PDEs). By representing each PDE as a computational graph, we enable seamless integration of the symbolic and numeric information inherent in the equation. A graph Transformer and an implicit neural representation (INR) are then employed to generate mesh-free predicted solutions. To pretrain our model, we generated a dataset of up to three million samples spanning diverse one-dimensional PDEs. Compared with baseline models trained specifically on benchmark datasets, our pretrained model achieves comparable accuracy via zero-shot inference, and its advantage widens after finetuning. For PDEs unseen during pretraining, our model adapts quickly when finetuned on a relatively small set of examples from the target equation. Additionally, PDEformer-1 demonstrates promising results on the inverse problems of recovering PDE scalar coefficients and coefficient fields.