Physics-informed neural networks (PINNs) are an increasingly popular class of techniques for the numerical solution of partial differential equations (PDEs), where neural networks are trained using loss functions regularized by relevant PDE terms to enforce physical constraints. We present a new class of PINNs called HyResPINNs, which augment traditional PINNs with adaptive hybrid residual blocks that combine the outputs of a standard neural network and a radial basis function (RBF) network. A key feature of our method is the inclusion of adaptive combination parameters within each residual block, which dynamically learn to weigh the contributions of the neural network and RBF network outputs. Additionally, adaptive connections between residual blocks allow for flexible information flow throughout the network. We show that HyResPINNs are more robust to training point locations and neural network architectures than traditional PINNs. Moreover, HyResPINNs offer orders of magnitude greater accuracy than competing methods on certain problems, with only modest increases in training costs. We demonstrate the strengths of our approach on challenging PDEs, including the Allen-Cahn equation and the Darcy-Flow equation. Our results suggest that HyResPINNs effectively bridge the gap between traditional numerical methods and modern machine learning-based solvers.
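The core idea of the hybrid residual block can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the Gaussian RBF form, and the sigmoid-squashed scalar `alpha` used to weigh the two branches are all assumptions made for clarity.

```python
import numpy as np

def rbf_branch(x, centers, widths, weights):
    # Gaussian RBF features: phi_j(x) = exp(-||x - c_j||^2 / w_j^2),
    # followed by a linear readout (illustrative choice of basis).
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    phi = np.exp(-d2 / widths**2)
    return phi @ weights

def mlp_branch(x, W1, b1, W2, b2):
    # A small tanh MLP standing in for the "standard neural network" branch.
    return np.tanh(x @ W1 + b1) @ W2 + b2

def hybrid_residual_block(x, params):
    # Adaptive combination: a trainable scalar (params["alpha_raw"]) is
    # squashed to (0, 1) and used to convexly weigh the two branch outputs;
    # the skip connection makes the block residual. The exact combination
    # rule here is a hypothetical sketch of the idea in the abstract.
    alpha = 1.0 / (1.0 + np.exp(-params["alpha_raw"]))
    nn_out = mlp_branch(x, *params["mlp"])
    rbf_out = rbf_branch(x, *params["rbf"])
    return x + alpha * nn_out + (1.0 - alpha) * rbf_out

# Example: a block mapping 2-D collocation points to 2-D features.
rng = np.random.default_rng(0)
params = {
    "alpha_raw": 0.0,  # alpha = 0.5: equal weight on both branches initially
    "mlp": (rng.normal(size=(2, 8)), np.zeros(8),
            rng.normal(size=(8, 2)), np.zeros(2)),
    "rbf": (rng.normal(size=(5, 2)), np.ones(5), rng.normal(size=(5, 2))),
}
x = rng.normal(size=(4, 2))
y = hybrid_residual_block(x, params)
```

In a full PINN, several such blocks would be stacked (with the abstract's adaptive inter-block connections), and both `alpha_raw` and the branch parameters would be trained jointly against the PDE-regularized loss.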