Parallelisation in Bayesian optimisation is a common strategy but faces several challenges: the need for flexibility in acquisition functions and kernel choices, flexibility in handling discrete and continuous variables simultaneously, model misspecification, and fast, massive parallelisation. To address these challenges, we introduce a versatile and modular framework for batch Bayesian optimisation via probabilistic lifting with kernel quadrature, called SOBER, which we present as a Python library based on GPyTorch/BoTorch. Our framework offers the following unique benefits: (1) Versatility in downstream tasks under a unified approach. (2) A gradient-free sampler, which does not require the gradient of acquisition functions, offering domain-agnostic sampling (e.g., discrete and mixed variables, non-Euclidean spaces). (3) Flexibility in the domain prior distribution. (4) Adaptive batch size (autonomous determination of the optimal batch size). (5) Robustness against a misspecified reproducing kernel Hilbert space. (6) A natural stopping criterion.
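To give intuition for the probabilistic-lifting-with-kernel-quadrature idea, the sketch below is a minimal, self-contained illustration and not SOBER's actual implementation: the acquisition function is lifted to a probability weight over candidate points drawn from a domain prior, and a batch is then selected by kernel herding, a simple greedy kernel-quadrature rule that matches the weighted kernel mean embedding. All names (`rbf_kernel`, `kernel_herding_batch`) are hypothetical; SOBER's own API and quadrature rule differ.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential kernel matrix between rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def kernel_herding_batch(candidates, weights, batch_size, lengthscale=1.0):
    """Greedily pick `batch_size` candidate indices whose empirical kernel
    mean approximates the weighted kernel mean embedding of `candidates`.

    `weights` plays the role of the probabilistic lift: e.g. the (normalised)
    exponentiated acquisition values times the domain prior density.
    """
    K = rbf_kernel(candidates, candidates, lengthscale)
    mu = K @ weights                 # kernel mean embedding at each candidate
    running = np.zeros(len(candidates))  # sum of k(., x_j) over chosen points
    chosen = []
    for t in range(batch_size):
        # Herding score: <mu - (1/t) sum_j k(x_j, .), k(x_i, .)>.
        scores = (mu - running / t) if t > 0 else mu.copy()
        scores[chosen] = -np.inf     # sample without replacement
        i = int(np.argmax(scores))
        chosen.append(i)
        running += K[:, i]
    return chosen

# Usage: candidates from a (hypothetical) Gaussian domain prior, uniform lift.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
w = np.full(200, 1 / 200)
batch = kernel_herding_batch(X, w, batch_size=8)
```

Because the sampler only evaluates the kernel and the weights, it needs no acquisition-function gradients, which is what makes this style of batch selection domain-agnostic (discrete, mixed, or non-Euclidean inputs only require a suitable kernel).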