Gaussian Processes (GPs) are widely regarded as state-of-the-art surrogate models for Bayesian optimization (BO) due to their ability to model uncertainty, their strong performance on tasks where correlations are easily captured (such as those defined by Euclidean metrics), and their support for efficient online updates. However, the performance of GPs depends on the choice of kernel, and kernel selection for complex correlation structures is often difficult or requires bespoke design. While Bayesian neural networks (BNNs) are a promising direction for higher-capacity surrogate models, they have so far seen limited use due to poor performance on some problem types. In this paper, we propose an approach that shows competitive performance across many problem types, including some on which BNNs typically struggle. We build on variational Bayesian last layers (VBLLs) and connect the training of these models to exact conditioning in GPs. We exploit this connection to develop an efficient online training algorithm that interleaves conditioning and optimization. Our findings suggest that VBLL networks significantly outperform GPs and other BNN architectures on tasks with complex input correlations, and match the performance of well-tuned GPs on established benchmark tasks.
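To make the GP connection concrete, the sketch below (our own illustration in Python/NumPy, not the paper's code; the names phi, prior_var, and noise_var are placeholders) implements exact Bayesian linear regression on a fixed feature map, the conditioning mechanism that a Bayesian last layer places on top of a learned network body. The paper's online algorithm interleaves such exact conditioning steps with gradient-based optimization of the features; the sketch shows only the conditioning half.

```python
import numpy as np

def phi(x):
    # Stand-in feature map; in a VBLL this is the trained network body.
    return np.array([1.0, x, x**2])

class BayesianLastLayer:
    """Exact Bayesian linear regression on features phi(x).

    With prior w ~ N(0, prior_var * I) and observation noise variance
    noise_var, conditioning on data is exact and can be done one
    observation at a time via rank-one precision updates.
    """

    def __init__(self, dim, prior_var=1.0, noise_var=0.1):
        self.P = np.eye(dim) / prior_var   # posterior precision
        self.b = np.zeros(dim)             # precision-weighted mean
        self.mu = np.zeros(dim)            # posterior mean of weights
        self.noise_var = noise_var

    def condition(self, x, y):
        # Rank-one posterior update: equivalent to exact GP conditioning
        # with kernel k(x, x') = prior_var * phi(x)^T phi(x').
        f = phi(x)
        self.P += np.outer(f, f) / self.noise_var
        self.b += f * y / self.noise_var
        self.mu = np.linalg.solve(self.P, self.b)

    def predict(self, x):
        # Predictive mean and variance at a new input.
        f = phi(x)
        S = np.linalg.inv(self.P)          # posterior covariance
        return f @ self.mu, f @ S @ f + self.noise_var
```

As a usage example, `bll = BayesianLastLayer(dim=3)`, followed by `bll.condition(0.5, 1.2)` for each new observation and `mean, var = bll.predict(0.7)` for acquisition, mirrors the condition-then-optimize loop described above, with each conditioning step costing only a rank-one update rather than a full refit.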