Derivative-free optimization algorithms play an important role in scientific and engineering design optimization problems, especially when derivative information is not accessible. In this paper, we study the framework of sequential classification-based derivative-free optimization algorithms. By introducing the learning-theoretic concept of the hypothesis-target shattering rate, we revisit the upper bound on the computational complexity of SRACOS (Hu, Qian, and Yu 2017). Inspired by the revisited upper bound, we propose an algorithm named RACE-CARS, which adds a random region-shrinking step to SRACOS. We further establish theorems showing the acceleration achieved by region shrinking. Experiments on synthetic functions, as well as on black-box tuning for language-model-as-a-service, empirically demonstrate the efficiency of RACE-CARS. An ablation study on the introduced hyperparameters is also conducted, revealing the mechanism of RACE-CARS and offering empirical guidance for hyperparameter tuning.