We study nonlinear Bayesian inverse problems arising from semilinear partial differential equations (PDEs) that can be transformed into linear Bayesian inverse problems. This allows us to extend early stopping for the Ensemble Kalman--Bucy filter (EnKBF) to this class of linearisable nonlinear problems as a way to tune the prior distribution. Using the linearisation method introduced in \cite{koers2024}, we transform the nonlinear problem into a linear one, apply early stopping based on the discrepancy principle, and then pull the resulting posterior back to the posterior for the original parameter of interest. Following \cite{tienstra2025}, we show that this approach yields adaptive posterior contraction rates and frequentist coverage guarantees under mild conditions on the prior covariance operator. It follows immediately that Tikhonov regularisation coupled with the discrepancy principle contracts at the same rate. The proposed method thus provides a data-driven way to tune Gaussian priors via early stopping that is both computationally efficient and statistically near optimal for nonlinear problems. Finally, we demonstrate our results theoretically and numerically on the classical benchmark problem of the time-independent Schr\"odinger equation.
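For orientation, a discrepancy-principle stopping rule of the kind used here can be sketched in generic form (a minimal illustration only; the symbols $Y$, $A$, $\bar{u}_t$, $\varepsilon$ and $\kappa$ are placeholders and not the notation fixed later in the paper):
\[
\tau := \inf\bigl\{\, t \ge 0 : \lVert Y - A \bar{u}_t \rVert \le \kappa\,\varepsilon \,\bigr\}, \qquad \kappa > 1,
\]
where $Y$ denotes the observed data, $A$ the linearised forward operator, $\bar{u}_t$ the EnKBF ensemble mean at algorithmic time $t$, and $\varepsilon$ the noise level; the filter is stopped at time $\tau$ and the corresponding ensemble approximates the early-stopped posterior, which is then pulled back to the original parameter.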