We study the training and performance of physics-informed learning for initial and boundary value problems (IBVPs) with physics-informed neural networks (PINNs) from a statistical learning perspective. Specifically, we restrict ourselves to parameterizations with hard initial and boundary condition constraints and reformulate the problem of estimating PINN parameters as a statistical learning problem. From this perspective, the physics penalty on the IBVP residuals is better understood not as a regularizing term but as an infinite source of indirect data, and the learning process as fitting the PINN distribution of residuals $p(y \mid x, t, w)\, q(x, t)$ to the true data-generating distribution $\delta(0)\, q(x, t)$ by minimizing the Kullback-Leibler divergence between the two. Furthermore, this analysis shows that physics-informed learning with PINNs is a singular learning problem, and we employ tools from singular learning theory, namely the Local Learning Coefficient (Lau et al., 2025), to analyze the estimates of PINN parameters obtained via stochastic optimization for a heat equation IBVP. Finally, we discuss the implications of this analysis for quantifying the predictive uncertainty of PINNs and for their extrapolation capacity.
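As a minimal illustration of the residual-fitting view described above, the following sketch (our own, not the paper's code; the ansatz, network sizes, and diffusivity `alpha` are illustrative assumptions) builds a hard-constrained PINN for the 1D heat equation $u_t = \alpha u_{xx}$ on $(0,1)$ with $u(x,0) = \sin(\pi x)$ and $u(0,t) = u(1,t) = 0$. The multiplicative ansatz satisfies the initial and boundary conditions exactly for any weights, so the physics residual is the only training signal; under a Gaussian residual model $p(y \mid x, t, w)$, minimizing the mean squared residual over collocation points drawn from $q(x, t)$ corresponds to the KL minimization above.

```python
# Illustrative sketch only: a hard-constrained PINN for the 1D heat
# equation u_t = alpha * u_xx, with IC u(x,0) = sin(pi x) and
# BC u(0,t) = u(1,t) = 0. All names and sizes here are assumptions.
import jax
import jax.numpy as jnp

alpha = 0.1  # assumed diffusivity


def mlp(w, x, t):
    """Tiny tanh MLP; w is a list of (W, b) layer pairs."""
    h = jnp.array([x, t])
    for W, b in w[:-1]:
        h = jnp.tanh(W @ h + b)
    W, b = w[-1]
    return (W @ h + b)[0]


def u(w, x, t):
    # Hard-constrained ansatz: sin(pi x) enforces the IC at t = 0, and
    # the factor t * x * (1 - x) vanishes on the initial and boundary
    # sets, so the constraints hold exactly for every w.
    return jnp.sin(jnp.pi * x) + t * x * (1.0 - x) * mlp(w, x, t)


def residual(w, x, t):
    # PDE residual u_t - alpha * u_xx via automatic differentiation.
    u_t = jax.grad(u, argnums=2)(w, x, t)
    u_xx = jax.grad(jax.grad(u, argnums=1), argnums=1)(w, x, t)
    return u_t - alpha * u_xx


def loss(w, xs, ts):
    # Mean squared residual over collocation points ~ q(x, t): the
    # empirical counterpart of fitting p(y | x, t, w) q(x, t) to the
    # delta-at-zero data-generating distribution.
    r = jax.vmap(residual, in_axes=(None, 0, 0))(w, xs, ts)
    return jnp.mean(r ** 2)


# Random initialization of the (assumed) 2-16-16-1 architecture.
key = jax.random.PRNGKey(0)
w = []
for m, n in [(2, 16), (16, 16), (16, 1)]:
    key, sub = jax.random.split(key)
    w.append((0.5 * jax.random.normal(sub, (n, m)), jnp.zeros(n)))
```

Because the constraints are hard, only `loss` needs to be driven down by the stochastic optimizer; there is no weighting between data and physics terms to tune.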