Gaussian Process (GP) regression is a powerful nonparametric Bayesian framework, but its performance depends critically on the choice of covariance kernel. Selecting an appropriate kernel is therefore central to model quality, yet it remains one of the most challenging and computationally expensive steps in probabilistic modeling. We present a Bayesian optimization (BO) framework built on a kernel-of-kernels geometry, using expected divergence-based distances between GP priors to explore kernel space efficiently. A multidimensional scaling (MDS) embedding of this distance matrix maps a discrete kernel library into a continuous Euclidean manifold, enabling smooth BO. In this formulation, the input space comprises kernel compositions, the objective is the log marginal likelihood, and the featurization is given by the MDS coordinates. When the divergence yields a valid metric, the embedding preserves the geometry and produces a stable BO landscape. We demonstrate the approach on synthetic benchmarks, real-world time-series datasets, and an additive manufacturing case study predicting melt-pool geometry, achieving superior predictive accuracy and uncertainty calibration relative to baselines, including Large Language Model (LLM)-guided search. This framework establishes a reusable probabilistic geometry for kernel search, with direct relevance to GP modeling and deep kernel learning.
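To make the pipeline concrete, the sketch below walks the three steps described above on toy data: a pairwise distance matrix between GP priors, an MDS embedding of that matrix, and a BO loop whose objective is the log marginal likelihood. It is a minimal illustration under stated assumptions, not the paper's implementation: the four-kernel library and the data are stand-ins, the Frobenius distance between prior Gram matrices substitutes for the paper's expected divergence, and the expected-improvement acquisition is an assumed choice.

```python
import numpy as np
from scipy.stats import norm
from sklearn.manifold import MDS
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
    RBF, Matern, RationalQuadratic, ExpSineSquared, WhiteKernel,
)

rng = np.random.default_rng(0)

# Toy data and a four-kernel library (illustrative stand-ins, not the paper's setup).
X = np.linspace(0.0, 10.0, 60)[:, None]
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(60)
library = [RBF(1.0), Matern(1.0, nu=1.5), RationalQuadratic(1.0), ExpSineSquared(1.0, 3.0)]

# Step 1: pairwise distances between GP priors. The Frobenius distance between
# prior Gram matrices on a reference grid stands in for the expected divergence.
G = np.linspace(0.0, 10.0, 25)[:, None]
grams = [k(G) for k in library]
n = len(library)
D = np.array([[np.linalg.norm(grams[i] - grams[j]) for j in range(n)] for i in range(n)])

# Step 2: MDS maps the discrete library to continuous Euclidean coordinates.
Z = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(D)

# Step 3: BO over the embedding; the objective is the log marginal likelihood.
def lml(kernel):
    gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-6, normalize_y=True).fit(X, y)
    return gp.log_marginal_likelihood_value_

tried = [0, 1]                                  # seed with two arbitrary kernels
scores = [lml(library[i]) for i in tried]
for _ in range(2):                              # a few BO iterations
    surrogate = GaussianProcessRegressor(
        kernel=RBF(1.0) + WhiteKernel(1e-3), normalize_y=True
    ).fit(Z[tried], scores)
    mu, sd = surrogate.predict(Z, return_std=True)
    z = (mu - max(scores)) / np.maximum(sd, 1e-9)
    ei = (mu - max(scores)) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
    ei[tried] = -np.inf                         # do not re-evaluate visited kernels
    nxt = int(np.argmax(ei))
    tried.append(nxt)
    scores.append(lml(library[nxt]))

print("selected kernel:", library[tried[int(np.argmax(scores))]])
```

In the paper's setting the library would contain kernel compositions and the distances would come from the expected divergence between GP priors; the structure of the loop is unchanged.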