The least trimmed squares (LTS) is a reasonable formulation of robust regression, but it suffers from high computational cost due to the nonconvexity and nonsmoothness of its objective function. The widely used FAST-LTS algorithm becomes particularly slow when a sparsity-inducing penalty such as the $\ell_1$ norm is added. This paper proposes a computationally inexpensive algorithm for the sparse LTS, based on the proximal gradient method combined with a reformulation technique. The proposed method comes with theoretical convergence guarantees that are preferable to those of existing methods. Numerical experiments show that our method efficiently attains small objective values.
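To make the setting concrete, the sketch below applies a plain proximal gradient iteration directly to the sparse LTS objective (sum of the $h$ smallest squared residuals plus an $\ell_1$ penalty). This is only an illustrative baseline under assumed names and settings, not the paper's reformulated algorithm; the function `sparse_lts_prox_grad`, its step-size choice, and the data are all hypothetical.

```python
import numpy as np

def sparse_lts_prox_grad(X, y, h, lam, step=None, n_iter=500):
    """Illustrative proximal-gradient sketch for sparse LTS:
    minimize (sum of the h smallest squared residuals)/2 + lam * ||beta||_1.
    NOTE: a simplified stand-in, not the algorithm proposed in the paper."""
    n, p = X.shape
    beta = np.zeros(p)
    if step is None:
        # conservative step size from the spectral norm of X
        step = 1.0 / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        r = X @ beta - y
        # trimming: keep the h observations with the smallest squared residuals
        idx = np.argsort(r ** 2)[:h]
        grad = X[idx].T @ r[idx]          # gradient of the trimmed least-squares loss
        z = beta - step * grad
        # soft-thresholding: the proximal operator of the l1 penalty
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return beta

# usage: sparse ground truth, 10% gross outliers in the responses
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
beta_true = np.zeros(10)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(100)
y[:10] += 20.0                            # contaminate 10 observations
beta_hat = sparse_lts_prox_grad(X, y, h=80, lam=0.1)
```

Because the trimming step discards the contaminated residuals at each iteration, the fit is driven by the clean observations, while the soft-thresholding step keeps the irrelevant coefficients at zero.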