Physics-informed machine learning combines the expressiveness of data-based approaches with the interpretability of physical models. In this context, we consider a general regression problem where the empirical risk is regularized by a partial differential equation that quantifies the physical inconsistency. We prove that for linear differential priors, the problem can be formulated as a kernel regression task. Taking advantage of kernel theory, we derive convergence rates for the minimizer of the regularized risk and show that it converges at least at the Sobolev minimax rate. However, faster rates can be achieved, depending on the physical error. This principle is illustrated with a one-dimensional example, supporting the claim that regularizing the empirical risk with physical information can be beneficial to the statistical performance of estimators.
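To make the setting concrete, the following is a minimal one-dimensional sketch of the idea described above: an empirical risk regularized by a linear differential prior admits a closed-form (ridge-type) minimizer, mirroring the kernel regression formulation. All specifics here are illustrative assumptions, not the paper's construction: the prior is taken to be $Df = f''$ (whose solutions are affine functions), $f$ is expanded in a monomial basis, and the $L^2$ norm of $Df$ is approximated on a quadrature grid.

```python
import numpy as np

# Illustrative setup (assumed, not from the paper): data generated from an
# affine ground truth, consistent with the physical prior D f = f'' = 0.
rng = np.random.default_rng(0)
n, degree, lam = 50, 8, 1e-2

x = rng.uniform(0.0, 1.0, n)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(n)

# Design matrix for the monomial basis 1, x, ..., x^degree.
Phi = np.vander(x, degree + 1, increasing=True)

# Matrix of the operator D = d^2/dx^2 in the same basis, evaluated on a
# quadrature grid, so ||D f||_{L^2}^2 is approximated by a quadratic form
# theta^T (G^T G / m) theta in the coefficients theta.
grid = np.linspace(0.0, 1.0, 200)
G = np.zeros((grid.size, degree + 1))
for k in range(2, degree + 1):
    G[:, k] = k * (k - 1) * grid ** (k - 2)

# Closed-form minimizer of the regularized empirical risk
#   (1/n) ||Phi theta - y||^2 + lam * (1/m) ||G theta||^2,
# obtained from the normal equations, as in ridge/kernel regression.
A = Phi.T @ Phi / n + lam * G.T @ G / grid.size
theta = np.linalg.solve(A, Phi.T @ y / n)

f_hat = np.vander(grid, degree + 1, increasing=True) @ theta
```

Because affine functions lie in the null space of the penalty, the data term alone determines the affine component while the physical prior suppresses spurious curvature, which is the mechanism by which physical regularization can improve statistical performance.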