In recent years, there have been significant advances in efficiently solving $\ell_s$-regression using linear system solvers and $\ell_2$-regression [Adil-Kyng-Peng-Sachdeva, J. ACM'24]. Would efficient smoothed $\ell_p$-norm solvers lead to even faster rates for solving $\ell_s$-regression when $2 \leq p < s$? In this paper, we give an affirmative answer to this question and show how to solve $\ell_s$-regression using $\tilde{O}(n^{\frac{\nu}{1+\nu}})$ iterations of solving smoothed $\ell_p$-regression problems, where $\nu := \frac{1}{p} - \frac{1}{s}$. To obtain this result, we provide improved accelerated rates for convex optimization problems when given access to an $\ell_p^s(\lambda)$-proximal oracle, which, for a point $c$, returns the solution of the regularized problem $\min_{x} f(x) + \lambda \|x-c\|_p^s$. Additionally, we show that these rates for the $\ell_p^s(\lambda)$-proximal oracle are optimal for algorithms that query in the span of the outputs of the oracle, and we further apply our techniques to settings of high-order and quasi-self-concordant optimization.
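To make the oracle interface concrete, the following is a minimal numerical sketch of an $\ell_p^s(\lambda)$-proximal step: given a convex objective $f$, a center $c$, and a regularization weight $\lambda$, it returns an approximate minimizer of $f(x) + \lambda \|x-c\|_p^s$. This is a toy illustration using a generic derivative-free solver, not the paper's algorithm; the function names and the quadratic test objective are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def lp_s_prox(f, c, lam, p, s):
    """Approximately solve min_x f(x) + lam * ||x - c||_p^s.

    Toy oracle for illustration only: uses Nelder-Mead on the
    regularized objective, started from the center point c.
    """
    c = np.asarray(c, dtype=float)
    obj = lambda x: f(x) + lam * np.linalg.norm(x - c, ord=p) ** s
    res = minimize(obj, x0=c, method="Nelder-Mead",
                   options={"xatol": 1e-9, "fatol": 1e-12, "maxiter": 50000})
    return res.x

# Hypothetical convex test objective with minimizer at the all-ones vector.
f = lambda x: np.sum((x - 1.0) ** 2)
c = np.zeros(3)

# Weak regularization: the prox point stays close to argmin f.
x_small = lp_s_prox(f, c, lam=1e-3, p=2, s=4)
# Strong regularization: the prox point is pulled toward the center c.
x_large = lp_s_prox(f, c, lam=1e4, p=2, s=4)
```

Here `p = 2`, `s = 4` is an arbitrary choice satisfying $2 \leq p < s$; the two calls show the role of $\lambda$ in interpolating between minimizing $f$ and staying near $c$.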