In this paper, we focus on simple bilevel optimization problems, where we minimize a convex smooth objective function over the optimal solution set of another convex smooth constrained optimization problem. We present a novel bilevel optimization method that locally approximates the solution set of the lower-level problem using a cutting plane approach and employs an accelerated gradient-based update to reduce the upper-level objective function over the approximated solution set. We measure the performance of our method in terms of suboptimality and infeasibility errors and provide non-asymptotic convergence guarantees for both error criteria. Specifically, when the feasible set is compact, we show that our method requires at most $\mathcal{O}(\max\{1/\sqrt{\epsilon_{f}}, 1/\epsilon_g\})$ iterations to find a solution that is $\epsilon_f$-suboptimal and $\epsilon_g$-infeasible. Moreover, under the additional assumption that the lower-level objective satisfies the $r$-th H\"olderian error bound, we show that our method achieves an iteration complexity of $\mathcal{O}(\max\{\epsilon_{f}^{-\frac{2r-1}{2r}},\epsilon_{g}^{-\frac{2r-1}{2r}}\})$, which matches the optimal complexity of single-level convex constrained optimization when $r=1$.
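For concreteness, the simple bilevel problem described above can be stated as follows, where $f$ denotes the upper-level objective, $g$ the lower-level objective, and $\mathcal{Z}$ the lower-level feasible set (the symbols $f$, $g$, and $\mathcal{Z}$ are generic placeholders suggested by the error tolerances $\epsilon_f$ and $\epsilon_g$, not notation fixed by this abstract):
\begin{align*}
\min_{x \in \mathbb{R}^n} \quad & f(x) \\
\text{s.t.} \quad & x \in \operatorname*{argmin}_{z \in \mathcal{Z}} \; g(z),
\end{align*}
i.e., the upper-level objective $f$ is minimized over the set of optimal solutions of the lower-level problem $\min_{z \in \mathcal{Z}} g(z)$, with both $f$ and $g$ convex and smooth.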