For linear inverse problems with Gaussian priors and observation noise, the posterior is Gaussian, with mean and covariance determined by the conditioning formula. We analyse measure approximation problems of finding the best approximation to the posterior in a family of Gaussians with approximate covariance or approximate mean, for Hilbert parameter spaces and finite-dimensional observations. We quantify the error of the approximating Gaussian either with the Kullback-Leibler divergence or the family of Rényi divergences. Using the Feldman-Hájek theorem and recent results on reduced-rank operator approximations, we identify optimal solutions to these measure approximation problems. Our results extend those of Spantini et al. (SIAM J. Sci. Comput. 2015) to Hilbertian parameter spaces. In addition, our results show that the posterior differs from the prior only on a subspace whose dimension equals the rank of the Hessian of the negative log-likelihood, and that this subspace is contained in the Cameron-Martin space of the prior.
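The finite-dimensional analogue of the conditioning formula is easy to state concretely. The following minimal sketch (all names and dimensions are illustrative, not from the paper) forms the Gaussian posterior for a linear model y = Gx + η with prior N(m_pr, C_pr) and noise N(0, C_obs), and checks the rank statement: the prior-to-posterior covariance update has rank equal to that of the Hessian of the negative log-likelihood, H = Gᵀ C_obs⁻¹ G, which is at most the observation dimension.

```python
import numpy as np

# Illustrative dimensions: n-dimensional parameter, m-dimensional observation (m < n).
rng = np.random.default_rng(0)
n, m = 5, 3
G = rng.standard_normal((m, n))      # forward (observation) operator
C_pr = np.eye(n)                     # prior covariance
C_obs = 0.1 * np.eye(m)              # observation-noise covariance
m_pr = np.zeros(n)                   # prior mean
y = rng.standard_normal(m)           # observed data

# Conditioning formula: the posterior is Gaussian with
#   C_post = (C_pr^{-1} + G^T C_obs^{-1} G)^{-1}
#   m_post = C_post (G^T C_obs^{-1} y + C_pr^{-1} m_pr)
H = G.T @ np.linalg.solve(C_obs, G)  # Hessian of the negative log-likelihood
C_post = np.linalg.inv(np.linalg.inv(C_pr) + H)
m_post = C_post @ (G.T @ np.linalg.solve(C_obs, y)
                   + np.linalg.solve(C_pr, m_pr))

# The posterior differs from the prior only on a subspace of dimension rank(H):
# here rank(H) <= m = 3, so C_pr - C_post has rank at most 3 even though n = 5.
print(np.linalg.matrix_rank(C_pr - C_post))
```

In the Hilbert-space setting of the paper this update acts on (a subspace of) the Cameron-Martin space of the prior rather than on a matrix, but the finite-rank structure of the update is the same phenomenon.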