Parallelisation is a common strategy in Bayesian optimisation, but it faces several challenges: the need for flexibility in acquisition-function and kernel choices, the need to handle discrete and continuous variables simultaneously, model misspecification, and fast, massive parallelisation. To address these challenges, we introduce SOBER, a versatile and modular framework for batch Bayesian optimisation via probabilistic lifting with kernel quadrature, which we present as a Python library built on GPyTorch/BoTorch. Our framework offers the following unique benefits: (1) versatility across downstream tasks under a unified approach; (2) a gradient-free sampler that does not require gradients of the acquisition function, offering domain-agnostic sampling (e.g., discrete and mixed variables, non-Euclidean spaces); (3) flexibility in the choice of domain prior distribution; (4) adaptive batch sizes (autonomous determination of the optimal batch size); (5) robustness against a misspecified reproducing kernel Hilbert space; and (6) a natural stopping criterion.
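As background for the kernel-quadrature component mentioned above, the following toy sketch illustrates the basic idea of choosing quadrature weights on candidate points so that a weighted sum reproduces an expectation under a prior. This is a minimal NumPy illustration under our own assumptions (the `rbf` kernel, node counts, and test function are all illustrative), not SOBER's actual implementation or API:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, ls=1.0):
    """RBF (Gaussian) kernel matrix between row-stacked points a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

# Quadrature nodes X and a large Monte Carlo sample S from the prior pi.
X = rng.normal(size=(20, 2))    # candidate/quadrature nodes
S = rng.normal(size=(2000, 2))  # sample from pi (standard normal here)

# Kernel quadrature idea: pick weights w so that sum_i w_i f(x_i)
# approximates E_pi[f] for smooth f, by matching the kernel mean embedding.
K = rbf(X, X)                          # Gram matrix on the nodes
z = rbf(X, S).mean(axis=1)             # empirical kernel mean embedding
w = np.linalg.solve(K + 1e-8 * np.eye(len(X)), z)  # quadrature weights

# Compare the weighted node estimate against plain Monte Carlo.
f = lambda x: np.sin(x).sum(axis=1)
est_kq = w @ f(X)        # kernel quadrature estimate from only 20 nodes
est_mc = f(S).mean()     # Monte Carlo estimate from 2000 samples
```

The same weight-solving step is the building block that batch methods exploit: the selected nodes and weights summarise the prior compactly, which is what allows a whole batch of informative points to be chosen at once.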