This paper explores challenges in training Physics-Informed Neural Networks (PINNs), emphasizing the role of the loss landscape in the training process. We examine difficulties in minimizing the PINN loss function, particularly the ill-conditioning caused by differential operators in the residual term. We compare the gradient-based optimizers Adam, L-BFGS, and their combination Adam+L-BFGS, showing that the combination outperforms either method alone, and introduce a novel second-order optimizer, NysNewton-CG (NNCG), which significantly improves PINN performance. Theoretically, our work elucidates the connection between ill-conditioned differential operators and ill-conditioning in the PINN loss, and shows the benefits of combining first- and second-order optimization methods. Our work presents valuable insights and more powerful optimization strategies for training PINNs, which could improve the utility of PINNs for solving difficult partial differential equations.
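To make the Adam+L-BFGS combination mentioned above concrete, the following is a minimal sketch (not the paper's implementation) of two-stage PINN training in PyTorch: Adam makes fast initial progress on the nonconvex loss, then L-BFGS refines the solution. The network architecture, collocation points, and the toy residual in `pinn_loss` are illustrative assumptions standing in for a full PINN setup.

```python
# Minimal illustrative sketch of "Adam+L-BFGS" training for a PINN loss.
# The model, collocation points, and toy heat-equation residual are placeholders.
import torch

torch.manual_seed(0)
model = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
x = torch.rand(1024, 2, requires_grad=True)  # interior collocation points (t, x)

def pinn_loss():
    """Toy residual loss for u_t - u_xx = 0; stands in for the full PINN loss."""
    u = model(x)
    grads = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_t, u_x = grads[:, 0:1], grads[:, 1:2]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0][:, 1:2]
    return ((u_t - u_xx) ** 2).mean()

# Stage 1: Adam for fast initial progress.
adam = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2000):
    adam.zero_grad()
    pinn_loss().backward()
    adam.step()

# Stage 2: L-BFGS to refine the solution near the Adam iterate.
lbfgs = torch.optim.LBFGS(model.parameters(), max_iter=500,
                          line_search_fn="strong_wolfe")

def closure():
    lbfgs.zero_grad()
    loss = pinn_loss()
    loss.backward()
    return loss

lbfgs.step(closure)
```

The paper's NNCG optimizer further addresses the ill-conditioning that quasi-Newton methods like L-BFGS handle only partially; the sketch above only illustrates the baseline combination strategy.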