Physics-informed machine learning (PIML) has emerged as a promising alternative to conventional numerical methods for solving partial differential equations (PDEs). PIML models are increasingly built with deep neural networks (NNs) whose architecture and training process are designed so that the network satisfies the PDE system. While such PIML models have advanced substantially over the past few years, their performance remains highly sensitive to the NN's architecture and loss function. Motivated by this limitation, we introduce kernel-weighted Corrective Residuals (CoRes), which integrate the strengths of kernel methods and deep NNs for solving nonlinear PDE systems. To achieve this integration, we design a modular and robust framework that consistently outperforms competing methods across a broad range of benchmark problems. This performance improvement has a theoretical justification and is particularly attractive because it simplifies the training process while only negligibly increasing inference costs. Additionally, our studies on multiple PDEs indicate that kernel-weighted CoRes considerably reduce the sensitivity of NNs to factors such as random initialization, architecture type, and choice of optimizer. We believe these findings can spark renewed interest in leveraging kernel methods for solving PDEs.
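To make the idea of a kernel-weighted corrective residual concrete, the sketch below shows one way such a correction can be realized in 1-D, under assumptions not stated in the abstract: an RBF kernel with a hand-picked length scale, hypothetical boundary conditions u(0) = 0 and u(1) = 1, and a fixed smooth function standing in for a trained NN. The kernel term conditions the NN prediction on the boundary data, so the combined solution reproduces the boundary values exactly no matter how well the network alone fits them.

```python
import numpy as np

def rbf(x1, x2, ell=0.5):
    # Gaussian (RBF) kernel matrix between two sets of 1-D points.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Hypothetical boundary points and prescribed values on [0, 1]:
# u(0) = 0, u(1) = 1 (chosen for illustration only).
Xb = np.array([0.0, 1.0])
ub = np.array([0.0, 1.0])

def mean_nn(x):
    # Stand-in for a trained neural network; any smooth
    # approximation works. This one deliberately violates u(1) = 1.
    return np.sin(np.pi * x)

def corrected(x, ell=0.5, jitter=1e-10):
    # Kernel-weighted corrective residual:
    #   u(x) = NN(x) + k(x, Xb) K^{-1} (ub - NN(Xb))
    # i.e. the NN prediction plus a kernel interpolant of its
    # residual at the boundary points.
    K = rbf(Xb, Xb, ell) + jitter * np.eye(len(Xb))
    alpha = np.linalg.solve(K, ub - mean_nn(Xb))
    return mean_nn(x) + rbf(np.atleast_1d(x), Xb, ell) @ alpha

# The corrected solution matches the boundary data (up to jitter),
# which is what removes boundary terms from the training loss.
print(corrected(np.array([0.0, 1.0])))  # ≈ [0., 1.]
```

Because the boundary conditions are enforced by the kernel term rather than by a loss penalty, training can focus on the PDE residual alone, which is consistent with the simplified training process described above.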