Bayesian optimization is a powerful framework for optimizing functions that are expensive or time-consuming to evaluate. Recent work has considered Bayesian optimization of function networks (BOFN), where the objective function is given by a network of functions, each taking as input the outputs of previous nodes in the network as well as additional parameters. Leveraging this network structure has been shown to yield significant performance improvements. Existing BOFN algorithms for general-purpose networks evaluate the full network at each iteration. However, many real-world applications allow nodes to be evaluated individually. To exploit this, we propose a novel knowledge gradient acquisition function that chooses which node and corresponding inputs to evaluate in a cost-aware manner, thereby reducing query costs by evaluating only part of the network at each step. We provide an efficient approach to optimizing our acquisition function and show that it outperforms existing BOFN methods and other benchmarks across several synthetic and real-world problems. Our acquisition function is the first to enable cost-aware optimization of a broad class of function networks.
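The function-network structure described above can be sketched with a minimal toy example. The two node functions below are hypothetical stand-ins (not from the paper); the point is only that node 2 consumes node 1's output together with the original input, so a cost-aware method can query node 1 alone and defer paying for node 2:

```python
import math

def node1(x):
    # First node: depends only on the decision variable x.
    return math.sin(3.0 * x)

def node2(y, x):
    # Second node: takes the upstream output y plus the original input x.
    return -(y - 0.5) ** 2 + 0.1 * x

def full_network(x):
    # Full-network evaluation: pays the query cost of both nodes at once,
    # as existing general-purpose BOFN algorithms do each iteration.
    return node2(node1(x), x)

# Partial evaluation: query only node 1 and cache its output; a cost-aware
# acquisition function can then decide whether node 2 is worth its cost.
y1 = node1(0.4)
objective = node2(y1, 0.4)
assert abs(objective - full_network(0.4)) < 1e-12
```

Evaluating per node rather than end-to-end is what opens the gap the proposed knowledge gradient acquisition function exploits: when node costs differ, cheap upstream queries can resolve most of the posterior uncertainty before an expensive downstream query is made.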