In this paper, we study the neural tangent kernel (NTK) of physics-informed neural networks (PINNs) for general partial differential equations (PDEs). It is well known that the training dynamics of an artificial neural network can be recast as the evolution of its NTK. We analyze the NTK at initialization and derive conditions under which it converges during training for general PDEs. The theoretical results show that the homogeneity of the differential operators plays a crucial role in the convergence of the NTK. Moreover, using PINNs, we validate these convergence conditions on the initial value problem of the sine-Gordon equation and the initial-boundary value problem of the KdV equation.
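The claim that training can be recast as NTK evolution can be made precise by the standard gradient-flow identity; the notation below (network $u(x;\theta)$, training inputs $x_i$, targets $y_i$) is generic and is not fixed by the abstract:

```latex
% Empirical NTK of a network u(x; \theta) with parameters \theta:
K(x, x') \;=\; \nabla_\theta u(x;\theta)^{\top}\, \nabla_\theta u(x';\theta).

% Under gradient flow \dot{\theta} = -\nabla_\theta \mathcal{L} with the
% squared loss \mathcal{L} = \tfrac{1}{2} \sum_i \bigl(u(x_i;\theta) - y_i\bigr)^2,
% the chain rule gives the kernel evolution of the network outputs:
\frac{\mathrm{d}}{\mathrm{d}t}\, u\bigl(x_j;\theta(t)\bigr)
  \;=\; -\sum_i K(x_j, x_i)\,\bigl(u(x_i;\theta(t)) - y_i\bigr).
```

For PINNs, the loss additionally contains PDE residual terms of the form $\mathcal{N}[u](x_i)$ for a differential operator $\mathcal{N}$, so derivatives of the network enter the kernel; this is the route by which the homogeneity of the operator affects NTK convergence.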