Bayesian optimization (BO) is a sequential approach for optimizing black-box objective functions using noisy zeroth-order observations. In BO, Gaussian processes (GPs) are employed as probabilistic surrogate models to estimate the objective function based on past observations, guiding the selection of future queries to maximize utility. However, the performance of BO heavily relies on the quality of these probabilistic estimates, which can deteriorate significantly under model misspecification. To address this issue, we introduce localized online conformal prediction-based Bayesian optimization (LOCBO), a BO algorithm that calibrates the GP model through localized online conformal prediction (CP). LOCBO corrects the GP likelihood based on the predictive sets produced by localized online CP, and the corrected GP likelihood is then denoised to obtain a calibrated posterior distribution on the objective function. The likelihood calibration step leverages an input-dependent calibration threshold to tailor coverage guarantees to different regions of the input space. Under minimal noise assumptions, we provide theoretical performance guarantees for LOCBO's iterates that hold for the unobserved objective function. These theoretical findings are validated through experiments on synthetic and real-world optimization tasks, demonstrating that LOCBO consistently outperforms state-of-the-art BO algorithms in the presence of model misspecification.
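To make the calibration idea above concrete, the following is a minimal sketch of localized online CP with an input-dependent threshold, not the authors' implementation. It assumes a standardized GP residual as the nonconformity score and uses an online update that raises the local threshold after a coverage miss and lowers it after a hit; the names (`alpha`, `eta`, the RBF bandwidth `h`) and the kernel-weighted threshold estimate are illustrative assumptions.

```python
import numpy as np

def rbf(x, xp, h=0.5):
    """Localization kernel: weight past points by proximity to the query x."""
    return np.exp(-np.sum((np.asarray(x) - np.asarray(xp)) ** 2) / (2 * h ** 2))

class LocalizedOnlineCP:
    """Illustrative input-dependent calibration thresholds via online updates.

    After observing (x_t, y_t), the threshold near x_t grows when the
    predictive set missed y_t and shrinks when it covered y_t, so that
    long-run coverage in each region tends toward 1 - alpha.
    """

    def __init__(self, alpha=0.1, eta=0.05):
        self.alpha, self.eta = alpha, eta   # target miscoverage, step size
        self.points, self.thetas = [], []   # past inputs and their thresholds

    def threshold(self, x):
        # Kernel-weighted average of stored thresholds; a default of 1.0
        # (one standard deviation) is used before any updates.
        if not self.points:
            return 1.0
        w = np.array([rbf(x, xp) for xp in self.points])
        if w.sum() < 1e-12:
            return float(np.mean(self.thetas))
        return float(w @ np.array(self.thetas) / w.sum())

    def update(self, x, y, mu, sigma):
        # Nonconformity score: standardized GP residual at the query point.
        score = abs(y - mu) / sigma
        theta = self.threshold(x)
        miss = float(score > theta)  # 1 if the predictive set missed y
        # Online CP update: raise the local threshold after a miss,
        # lower it after a hit, in proportion to the step size eta.
        theta_new = theta + self.eta * (miss - self.alpha)
        self.points.append(np.asarray(x, dtype=float))
        self.thetas.append(theta_new)
        return theta_new
```

The predictive set at a query `x` would then be `[mu - theta(x) * sigma, mu + theta(x) * sigma]`, with `theta(x)` adapting separately in different regions of the input space rather than using one global quantile.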