We introduce fast algorithms for solving $\ell_{p}$ regression problems using the iteratively reweighted least squares (IRLS) method. Our approach achieves state-of-the-art iteration complexity, improving on the IRLS algorithm of Adil-Peng-Sachdeva (NeurIPS 2019) and matching the theoretical bounds established by the complex algorithm of Adil-Kyng-Peng-Sachdeva (SODA 2019, J. ACM 2024) via a simpler, lightweight iterative scheme. This bridges the gap between theoretical and practical algorithms for $\ell_{p}$ regression. Our algorithms depart from prior approaches: they use a primal-dual framework in which the update rule is naturally derived from an invariant maintained for the dual objective. Empirically, we show that our algorithms significantly outperform both the IRLS algorithm of Adil-Peng-Sachdeva and MATLAB/CVX implementations.
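For context, below is a minimal sketch of the classical IRLS scheme for $\ell_{p}$ regression, $\min_x \|Ax - b\|_p$ with $p \ge 2$: each iteration solves a weighted least-squares problem whose weights are $|r_i|^{p-2}$ for the current residuals $r = Ax - b$. This is the textbook baseline that the algorithms above refine, not the paper's primal-dual method; the function name `irls_lp`, the residual floor `eps`, and the fixed iteration count are illustrative assumptions.

```python
import numpy as np

def irls_lp(A, b, p=4.0, iters=50, eps=1e-8):
    """Classical IRLS sketch for min_x ||Ax - b||_p with p >= 2.

    Each step solves the weighted least-squares problem
        min_x sum_i w_i (a_i^T x - b_i)^2,  w_i = |r_i|^(p-2),
    via its normal equations.
    """
    # Start from the ordinary least-squares (p = 2) solution.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(iters):
        r = A @ x - b
        # Floor the residuals at eps so weights stay finite near zero.
        w = np.maximum(np.abs(r), eps) ** (p - 2)
        Aw = A * w[:, None]                    # rows of A scaled by w_i
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)  # A^T W A x = A^T W b
    return x

# Usage: x = irls_lp(np.random.randn(100, 10), np.random.randn(100), p=4.0)
```

Plain IRLS of this form can oscillate or stall without careful damping of the weight updates, which is precisely the gap between heuristic reweighting and schemes with provable iteration complexity such as those discussed above.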