This work investigates the use of shallow physics-informed neural networks (PINNs) for solving forward and inverse problems of nonlinear partial differential equations (PDEs). By reformulating PINNs as nonlinear systems, the Levenberg-Marquardt (LM) algorithm is employed to efficiently optimize the network parameters. Analytical expressions for the neural network derivatives with respect to the input variables are derived, enabling accurate and efficient computation of the Jacobian matrix required by LM. The proposed approach is tested on several benchmark problems, including the Burgers, Schrödinger, Allen-Cahn, and three-dimensional Bratu equations. Numerical results demonstrate that LM significantly outperforms BFGS in terms of convergence speed, accuracy, and final loss values, even when using shallow network architectures with only two hidden layers. These findings indicate that, for a wide class of PDEs, shallow PINNs combined with efficient second-order optimization methods can provide accurate and computationally efficient solutions for both forward and inverse problems.
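The core idea — casting PINN training as a nonlinear least-squares problem and solving it with Levenberg-Marquardt — can be illustrated with a minimal sketch. The example below (not the paper's implementation) fits a one-hidden-layer network to the toy problem u'(x) = -u, u(0) = 1, using `scipy.optimize.least_squares` with `method="lm"`; the network's derivative with respect to the input is computed analytically, as in the paper, while the Jacobian with respect to the parameters is left to scipy's finite differences for brevity.

```python
# Minimal sketch (illustrative, not the paper's implementation):
# a shallow "PINN" for u'(x) = -u, u(0) = 1, trained with
# Levenberg-Marquardt via scipy.optimize.least_squares.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
n_hidden = 10
x = np.linspace(0.0, 1.0, 40)          # collocation points

def unpack(p):
    # Parameter vector -> (input weights, biases, output weights, output bias)
    w1 = p[:n_hidden]
    b1 = p[n_hidden:2 * n_hidden]
    w2 = p[2 * n_hidden:3 * n_hidden]
    b2 = p[-1]
    return w1, b1, w2, b2

def u(p, x):
    # One-hidden-layer tanh network approximating the PDE solution.
    w1, b1, w2, b2 = unpack(p)
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def du_dx(p, x):
    # Analytical derivative of the network w.r.t. its input x
    # (the kind of closed-form expression the paper derives).
    w1, b1, w2, b2 = unpack(p)
    z = np.outer(x, w1) + b1
    return (1.0 - np.tanh(z) ** 2) @ (w1 * w2)

def residuals(p):
    pde = du_dx(p, x) + u(p, x)          # residual of u' + u = 0
    bc = u(p, np.array([0.0])) - 1.0     # boundary condition u(0) = 1
    return np.concatenate([pde, bc])

p0 = rng.normal(scale=0.5, size=3 * n_hidden + 1)
sol = least_squares(residuals, p0, method="lm")   # Levenberg-Marquardt

x_test = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(u(sol.x, x_test) - np.exp(-x_test)))
print(f"max error vs exp(-x): {err:.2e}")
```

Note that `method="lm"` requires at least as many residuals as parameters (here 41 residuals for 31 parameters); supplying the analytical parameter Jacobian, as the paper does, avoids the cost and inaccuracy of finite differences.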