We consider the statistical inverse problem of recovering a parameter $\theta\in H^\alpha$ from data arising from the Gaussian regression problem \begin{equation*} Y = \mathscr{G}(\theta)(Z)+\varepsilon \end{equation*} with nonlinear forward map $\mathscr{G}:\mathbb{L}^2\to\mathbb{L}^2$, random design points $Z$, and Gaussian noise $\varepsilon$. The estimation strategy is based on a least squares approach under $\Vert\cdot\Vert_{H^\alpha}$-constraints. Under Lipschitz-type assumptions on the forward map $\mathscr{G}$, we establish the existence of a least squares estimator $\hat{\theta}$ as a maximizer of an associated functional. We then prove a general concentration result, from which consistency and upper bounds for the prediction error follow. The resulting rates of convergence reflect not only the smoothness of the parameter of interest but also the ill-posedness of the underlying inverse problem. We apply the general model to the Darcy problem, where the unknown coefficient function $f$ of a PDE is to be recovered, and provide the corresponding rates of convergence for the prediction and estimation errors. Finally, we briefly discuss the applicability of the general model to other problems.
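For concreteness, the constrained least squares criterion described above can be sketched as follows; the sample size $n$, observations $(Y_i,Z_i)$, and constraint radius $R>0$ are notational assumptions not fixed by the abstract:
\begin{equation*}
\hat{\theta} \in \operatorname*{arg\,min}_{\Vert\theta\Vert_{H^\alpha}\le R} \; \frac{1}{n}\sum_{i=1}^{n}\bigl(Y_i-\mathscr{G}(\theta)(Z_i)\bigr)^{2}.
\end{equation*}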