In this work, we present an adjoint-based method for discovering the governing partial differential equations (PDEs) underlying given data. The idea is to consider a parameterized PDE in a general form and formulate an optimization problem that minimizes the mismatch between the PDE solution and the data. Using variational calculus, we obtain an evolution equation for the Lagrange multipliers (the adjoint equations), which allows us to compute the gradient of the objective function with respect to the PDE parameters in a straightforward manner. In particular, we show how the corresponding adjoint equations can be derived for a family of parameterized, nonlinear PDEs. We demonstrate that, given a smooth data set, the proposed adjoint method recovers the true PDE up to machine accuracy. In the presence of noise, however, its accuracy becomes comparable to that of the well-known PDE Functional Identification of Nonlinear Dynamics method (PDE-FIND; Rudy et al., 2017). Even though the presented adjoint method relies on forward/backward solvers, it outperforms PDE-FIND on large data sets thanks to analytic expressions for the gradients of the objective function with respect to each PDE parameter.
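The adjoint idea described above can be illustrated with a minimal sketch; this is not the paper's implementation, but a toy discrete-adjoint example under simplifying assumptions (a single unknown diffusion coefficient `c` in the heat equation `u_t = c u_xx`, explicit Euler time stepping, data observed only at the final time, and all grid sizes chosen purely for illustration). The adjoint (backward) solve yields the gradient of the data-mismatch objective with respect to `c` analytically, which is then used in plain gradient descent:

```python
# Toy discrete-adjoint sketch: recover c in u_t = c * u_xx from synthetic
# data. Names (nx, nt, solve, ...) are illustrative, not from the paper.
import numpy as np

nx, nt = 50, 200
dx, dt = 1.0 / nx, 1e-4
x = np.linspace(0.0, 1.0, nx)

def laplacian(u):
    # Second-difference operator with homogeneous Dirichlet boundaries.
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    return lap

def solve(c):
    # Explicit-Euler forward solve; returns the full trajectory,
    # which the backward (adjoint) pass needs.
    u = np.sin(np.pi * x)
    traj = [u]
    for _ in range(nt):
        u = u + dt * c * laplacian(u)
        traj.append(u)
    return traj

c_true = 0.8
data = solve(c_true)[-1]          # synthetic observation of the final state

def loss_and_grad(c):
    # J(c) = ||u_N(c) - data||^2; its gradient comes from one backward
    # sweep of the adjoint variable lam, not from finite differences.
    traj = solve(c)
    resid = traj[-1] - data
    lam = 2.0 * resid             # terminal condition of the adjoint equation
    grad = 0.0
    for n in range(nt - 1, -1, -1):
        # dJ/dc accumulates lam^T (dt * L u_n); the interior Laplacian
        # block is self-adjoint, so L stands in for L^T here.
        grad += dt * lam @ laplacian(traj[n])
        lam = lam + dt * c * laplacian(lam)   # adjoint step, backward in time
    return resid @ resid, grad

c = 0.3                           # initial guess for the PDE parameter
for _ in range(200):
    J, g = loss_and_grad(c)
    c -= 0.5 * g                  # plain gradient descent
print(f"recovered c = {c:.4f} (true c = {c_true})")
```

Because the gradient is exact for the discretized problem, one forward and one backward solve per iteration suffice regardless of how many parameters the PDE has, which is the source of the scaling advantage over library-based regression as data sets grow.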