Black-box optimization problems often require simultaneously optimizing variables of different types, such as continuous, integer, and categorical variables. Unlike integer variables, categorical variables do not necessarily have a meaningful order, so the discretization approach used for continuous variables does not work well. Although several Bayesian optimization methods can deal with mixed-category black-box optimization (MC-BBO), they suffer from a lack of scalability to high-dimensional problems and from high internal computational cost. This paper proposes CatCMA, a stochastic optimization method for MC-BBO problems that employs the joint probability distribution of multivariate Gaussian and categorical distributions as the search distribution. CatCMA updates the parameters of the joint probability distribution in the natural gradient direction. CatCMA also incorporates the acceleration techniques used in the covariance matrix adaptation evolution strategy (CMA-ES) and the stochastic natural gradient method, such as step-size adaptation and learning rate adaptation. In addition, we restrict the range of the categorical distribution parameters by a margin to prevent premature convergence and analytically derive a promising margin setting. Numerical experiments show that the performance of CatCMA is superior and more robust with respect to problem dimensionality compared with state-of-the-art Bayesian optimization algorithms.
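To make the two ingredients named in the abstract concrete, the sketch below shows (a) sampling a mixed continuous/categorical candidate from a joint distribution of a multivariate Gaussian and independent categorical distributions, and (b) a simple margin correction that keeps every categorical probability bounded away from zero. This is a minimal illustration under assumed parameter values, not the actual CatCMA update: the names `sample_joint` and `apply_margin`, the dimensions, and the margin value are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper):
# a 3-dimensional Gaussian for the continuous part, and two categorical
# variables with 3 categories each for the discrete part.
mean = np.zeros(3)                 # Gaussian mean
cov = np.eye(3)                    # Gaussian covariance
q = np.array([[0.7, 0.2, 0.1],     # categorical parameters: one row per
              [0.3, 0.3, 0.4]])    # variable, probabilities sum to 1

def sample_joint(mean, cov, q, rng):
    """Draw one mixed sample (x, c): x from the Gaussian part,
    c from the per-variable categorical distributions."""
    x = rng.multivariate_normal(mean, cov)
    c = np.array([rng.choice(len(row), p=row) for row in q])
    return x, c

def apply_margin(q, margin):
    """Clip each categorical probability to at least `margin`, then
    renormalize each row, so no category's probability collapses to zero
    (which would cause premature convergence of the discrete part)."""
    q = np.maximum(q, margin)
    return q / q.sum(axis=1, keepdims=True)

x, c = sample_joint(mean, cov, q, rng)
q_m = apply_margin(np.array([[0.999, 0.0005, 0.0005]]), 1e-2)
```

Note that after renormalization the clipped entries can sit slightly below the nominal margin; CatCMA's actual margin handling and its analytically derived setting are given in the paper itself.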