Bayesian Optimization (BO) is a powerful method for optimizing black-box functions by combining prior knowledge with ongoing function evaluations. BO constructs a probabilistic surrogate model of the objective function given the covariates, which in turn informs the selection of future evaluation points through an acquisition function. For smooth, continuous search spaces, Gaussian Processes (GPs) are commonly used as the surrogate model because they offer analytical access to posterior predictive distributions, facilitating the computation and optimization of acquisition functions. However, in complex scenarios involving optimization over categorical or mixed covariate spaces, GPs may not be ideal. This paper introduces Simulation Based Bayesian Optimization (SBBO), a novel approach to optimizing acquisition functions that requires only sampling-based access to posterior predictive distributions. SBBO allows the use of surrogate probabilistic models tailored to combinatorial spaces with discrete variables; any Bayesian model whose posterior inference is carried out through Markov chain Monte Carlo can serve as the surrogate. We demonstrate empirically the effectiveness of SBBO in applications involving combinatorial optimization, across various choices of surrogate models.
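To make the core idea concrete, the following is a minimal, hypothetical sketch (not the paper's actual algorithm or code): the acquisition value at each discrete candidate is estimated purely from posterior predictive draws, as an MCMC-based surrogate would supply, rather than from a closed-form posterior. The toy surrogate, candidate set, and noise level are all illustrative assumptions.

```python
import random
import statistics

def monte_carlo_ei(predictive_samples, best_so_far):
    """Monte Carlo estimate of Expected Improvement from predictive draws.

    Needs only samples from the posterior predictive distribution,
    never its analytical form -- the key requirement SBBO relaxes.
    """
    return statistics.fmean(max(y - best_so_far, 0.0) for y in predictive_samples)

def toy_surrogate_samples(x, n_samples=200, rng=random.Random(0)):
    # Stand-in for posterior predictive draws at a binary vector x.
    # In SBBO this would be any Bayesian surrogate sampled via MCMC;
    # here a toy model whose mean is the number of ones in x.
    mean = float(sum(x))
    return [mean + rng.gauss(0.0, 0.5) for _ in range(n_samples)]

def propose_next(candidates, best_so_far):
    # Select the candidate maximizing the sampled acquisition value.
    return max(candidates,
               key=lambda x: monte_carlo_ei(toy_surrogate_samples(x), best_so_far))

candidates = [(0, 0, 1), (1, 0, 1), (1, 1, 1)]
next_x = propose_next(candidates, best_so_far=2.0)
```

In a real combinatorial setting the candidate set would be searched rather than enumerated, and the predictive draws would come from the chosen MCMC surrogate.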