Bayesian Optimisation (BO) is a state-of-the-art global optimisation technique for black-box problems where derivative information is unavailable and sample efficiency is crucial. However, improving the general scalability of BO has proved challenging. Here, we explore Latent Space Bayesian Optimisation (LSBO), which applies dimensionality reduction (DR) to perform BO in a lower-dimensional subspace. While early LSBO methods used (linear) random projections (Wang et al., 2013), we employ Variational Autoencoders (VAEs) to handle more complex data structures and general DR tasks. Building on Grosnit et al. (2021), we analyse the VAE-based LSBO framework, focusing on VAE retraining and deep metric loss. We suggest a few key corrections to their implementation, originally designed for tasks such as molecule generation, and reformulate the algorithm for broader optimisation purposes. Our numerical results show that structured latent manifolds improve BO performance. Additionally, we examine the use of the Mat\'{e}rn-$\frac{5}{2}$ kernel for Gaussian Processes in this LSBO context. We also integrate Sequential Domain Reduction (SDR), a standard efficiency strategy for global optimisation, into BO. SDR is implemented in a GPU-based environment using \textit{BoTorch}, both in the original space and in VAE-generated latent spaces, marking the first application of SDR within LSBO.