Recent advances (Sherman, 2017; Sidford and Tian, 2018; Cohen et al., 2021) have overcome the fundamental barrier of dimension dependence in the iteration complexity of solving $\ell_\infty$ regression with first-order methods. Yet it remains unclear to what extent such acceleration can be achieved for general $\ell_p$-smooth functions. In this paper, we propose a new accelerated first-order method for convex optimization under non-Euclidean smoothness assumptions. In contrast to standard acceleration techniques, our approach uses primal-dual iterate sequences taken with respect to \emph{differing} norms, which are then coupled using an \emph{implicitly} determined interpolation parameter. For $\ell_p$-norm smooth problems in $d$ dimensions, our method provides an iteration complexity improvement of up to $O(d^{1-\frac{2}{p}})$ in terms of calls to a first-order oracle, thereby allowing us to circumvent long-standing barriers in accelerated non-Euclidean steepest descent.
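To make the setting concrete (a standard sketch of the assumptions, not a statement taken from the paper itself): a differentiable convex function $f$ on $\mathbb{R}^d$ is $L$-smooth with respect to $\|\cdot\|_p$ if its gradient is $L$-Lipschitz from $\|\cdot\|_p$ to the dual norm $\|\cdot\|_{p^*}$, where $\tfrac{1}{p} + \tfrac{1}{p^*} = 1$:
\[
\|\nabla f(x) - \nabla f(y)\|_{p^*} \;\le\; L\, \|x - y\|_p \qquad \text{for all } x, y \in \mathbb{R}^d.
\]
For $p \ge 2$, the norm comparison $\|x\|_p \le \|x\|_2 \le d^{\frac{1}{2}-\frac{1}{p}} \|x\|_p$ indicates where a factor of this order can arise: reducing an $\ell_p$-smooth problem to the Euclidean setting can inflate the effective smoothness-times-radius term $L R^2$ by up to $d^{1-\frac{2}{p}}$ in the worst case (e.g., when $x_0 - x^*$ is a scaled all-ones vector), which matches the scale of the improvement claimed above.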