In statistical inference, a discrepancy between the parameter-to-observable map that generates the data and the parameter-to-observable map that is used for inference can lead to misspecified likelihoods and thus to incorrect estimates. In many inverse problems, the parameter-to-observable map is the composition of a linear state-to-observable map called an `observation operator' and a possibly nonlinear parameter-to-state map called the `model'. We consider such Bayesian inverse problems where the discrepancy in the parameter-to-observable map is due to the use of an approximate model that differs from the best model, i.e. to nonzero `model error'. Multiple approaches have been proposed to address such discrepancies, each leading to a specific posterior. We show how to use local Lipschitz stability estimates of posteriors with respect to likelihood perturbations to bound the Kullback--Leibler divergence of the posterior of each approach with respect to the posterior associated with the best model. Our bounds lead to criteria for choosing observation operators that mitigate the effect of model error in Bayesian inverse problems of this type. We illustrate one such criterion on an advection-diffusion-reaction PDE inverse problem from the literature, and use this example to discuss the importance and challenges of model-error-aware inference.
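The effect described above can be made concrete in a toy setting. The following sketch is not the paper's method: it is a minimal, hypothetical 1D example in which data are generated by a `best' model but inference uses an approximate (linearized) model, and the resulting Kullback--Leibler divergence between the two grid posteriors is computed numerically. All names and the choice of forward maps are illustrative assumptions.

```python
import numpy as np

# Hypothetical 1D setup: data y generated from the "best" model G,
# inference done with an approximate model G_hat (nonzero model error).
rng = np.random.default_rng(0)
G = np.sin                       # "best" parameter-to-observable map
G_hat = lambda th: th            # approximate map (small-angle linearization)
sigma = 0.1                      # observation noise standard deviation
theta_true = 0.3
y = G(theta_true) + sigma * rng.normal()

# Grid posterior under a standard normal prior, for either forward map.
theta = np.linspace(-2.0, 2.0, 4001)
dth = theta[1] - theta[0]

def posterior(forward):
    # Unnormalized log-posterior: log-prior + Gaussian log-likelihood.
    logp = -0.5 * theta**2 - 0.5 * ((y - forward(theta)) / sigma) ** 2
    p = np.exp(logp - logp.max())    # stabilize before exponentiating
    return p / (p.sum() * dth)       # normalize on the grid

p_best = posterior(G)        # posterior associated with the best model
p_approx = posterior(G_hat)  # posterior associated with the approximate model

# KL divergence D(p_approx || p_best), approximated by a Riemann sum.
mask = (p_approx > 0) & (p_best > 0)
kl = np.sum(p_approx[mask] * np.log(p_approx[mask] / p_best[mask])) * dth
print(f"KL(approx || best) = {kl:.6f}")
```

Because the linearization is accurate near the true parameter, the divergence here is small but nonzero; replacing `G_hat` with a poorer approximation inflates it, which is the quantity the paper's stability estimates are designed to bound.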