Solving parametric Partial Differential Equations (PDEs) for a broad range of parameters is a critical challenge in scientific computing. To this end, neural operators, which \textcolor{black}{predict the PDE solution given variable PDE parameter inputs}, have been successfully used. However, the training of neural operators typically demands large training datasets, the acquisition of which can be prohibitively expensive. To address this challenge, physics-informed training offers a cost-effective strategy. However, current physics-informed neural operators face limitations, either in handling irregular domain shapes or in generalizing to various discrete representations of PDE parameters. In this research, we introduce a novel physics-informed model architecture that can generalize to various discrete representations of PDE parameters and irregular domain shapes. In particular, inspired by deep operator neural networks, our model repeatedly performs a discretization-independent learning of the parameter embedding, and this parameter embedding is integrated with the response embeddings through multiple compositional layers for greater expressivity. Numerical results demonstrate the accuracy and efficiency of the proposed method. All the codes and data related to this work are available on GitHub: https://github.com/WeihengZ/PI-DCON.