We propose a new accelerated first-order method for convex optimization under non-Euclidean smoothness assumptions. In contrast to standard acceleration techniques, our approach uses primal-dual iterate sequences taken with respect to differing norms, which are then coupled using an implicitly determined interpolation parameter. For $\ell_p$ norm smooth problems in $d$ dimensions, our method provides an iteration complexity improvement of up to $O(d^{1-\frac{2}{p}})$ in terms of calls to a first-order oracle, thereby allowing us to circumvent long-standing barriers in accelerated non-Euclidean steepest descent.