In this work we derive higher-order error estimates, in terms of Bregman distances, for inverse problems perturbed by non-additive noise. The results are obtained by means of a novel source condition inspired by the dual problem. Specifically, we focus on variational regularization with the Kullback-Leibler divergence as data fidelity and a convex penalty term. In this framework, we provide an interpretation of the new source condition, and we present error estimates also when a variational formulation of the source condition is employed. We show that this approach extends to variational regularization with more general convex data fidelities.
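The variational framework referred to above can be sketched as follows; the notation (forward operator $A$, data $f$, penalty $J$, regularization parameter $\alpha$) is assumed for illustration and is not taken from the source:

```latex
% Variational regularization with Kullback-Leibler data fidelity:
% given noisy data f and a (typically linear) forward operator A,
% one minimizes over admissible u
\min_{u} \; \mathrm{KL}(f, Au) + \alpha\, J(u),
% where J is a convex penalty, \alpha > 0, and the KL divergence is
\mathrm{KL}(f, v) = \int \left( f \log \frac{f}{v} - f + v \right) \, \mathrm{d}x .
```

Error estimates for the minimizer are then typically stated in the Bregman distance $D_J^{p}(u, u^\dagger) = J(u) - J(u^\dagger) - \langle p, u - u^\dagger \rangle$ associated with the penalty $J$, where $u^\dagger$ is the exact solution and $p \in \partial J(u^\dagger)$.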