High-dimensional problems have long been considered the Achilles' heel of Bayesian optimization algorithms. Spurred by the curse of dimensionality, a large collection of algorithms aims to make Bayesian optimization more performant in this setting, commonly by imposing various simplifying assumptions on the objective. In this paper, we identify the degeneracies that make vanilla Bayesian optimization poorly suited to high-dimensional tasks, and further show how existing algorithms address these degeneracies through the lens of lowering the model complexity. Moreover, we propose an enhancement to the prior assumptions that are typical of vanilla Bayesian optimization, which reduces the complexity to manageable levels without imposing structural restrictions on the objective. Our modification, a simple scaling of the Gaussian process lengthscale prior with the dimensionality, reveals that standard Bayesian optimization works drastically better than previously thought in high dimensions, clearly outperforming existing state-of-the-art algorithms on multiple commonly considered real-world high-dimensional tasks.
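To make the proposed modification concrete, the following is a minimal sketch of a dimensionality-scaled lengthscale prior. It assumes a LogNormal prior on the lengthscale whose location parameter is shifted by log(√d), so that the prior median lengthscale grows as √d; the base constants `MU0` and `SIGMA0` are illustrative placeholders, not values taken from the paper.

```python
import math

# Illustrative base hyperparameters of the LogNormal lengthscale prior
# (assumptions for this sketch, not values prescribed by the paper).
MU0 = 0.0      # base log-lengthscale
SIGMA0 = 1.0   # prior spread in log-space

def lengthscale_prior_params(d: int) -> tuple[float, float]:
    """Return (mu, sigma) of a LogNormal lengthscale prior for dimension d.

    The location is shifted by log(sqrt(d)) = 0.5 * log(d), so the prior
    favors longer lengthscales as the dimensionality grows.
    """
    return MU0 + 0.5 * math.log(d), SIGMA0

def prior_median_lengthscale(d: int) -> float:
    """Median of LogNormal(mu, sigma) is exp(mu); here it scales as sqrt(d)."""
    mu, _ = lengthscale_prior_params(d)
    return math.exp(mu)
```

Under this scaling, the prior median lengthscale in 100 dimensions is 10× that in 1 dimension, which keeps the model complexity of the Gaussian process roughly constant as dimensionality increases rather than letting it grow unchecked.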