This paper studies a specific class of nonconvex sparsity-promoting regularization problems, namely $\ell_p$-norm regularization combined with a twice continuously differentiable loss function. We propose a novel second-order algorithm to effectively address this class of challenging nonconvex and nonsmooth problems, with several innovative features: (i) an alternating strategy that solves a reweighted $\ell_1$-regularized subproblem and a subspace approximate Newton step; (ii) the reweighted $\ell_1$-regularized subproblem relies on a convex approximation of the nonconvex regularization term and therefore admits a closed-form solution given by the soft-thresholding operator, which allows our method to be applied to a variety of nonconvex regularization problems; (iii) the algorithm guarantees that the iterates keep their signs and that the nonzero components stay bounded away from 0 for a sufficient number of iterations, so that it eventually transitions to a perturbed Newton method; (iv) we provide theoretical guarantees of global convergence, of local superlinear convergence under the Kurdyka-\L ojasiewicz (KL) property, and of local quadratic convergence when the exact Newton step is employed. We also demonstrate the effectiveness of our approach through experiments on a diverse set of model prediction problems.
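As a minimal illustration of feature (ii), with notation introduced here only for concreteness ($f$ the smooth loss, $\lambda>0$ the regularization parameter, $\alpha_k>0$ a step size; the paper's exact weights and proximal model may differ), linearizing each $|x_i|^p$, $0<p<1$, at the current iterate $x^k$ yields a convex weighted-$\ell_1$ subproblem of the form
\[
  \min_{x}\;\langle \nabla f(x^k),\, x - x^k\rangle
   + \tfrac{1}{2\alpha_k}\|x - x^k\|_2^2
   + \lambda \sum_{i} w_i^k |x_i|,
  \qquad w_i^k = p\,|x_i^k|^{p-1},
\]
whose solution is given componentwise by the soft-thresholding operator,
\[
  x_i^{k+1} = \mathcal{S}_{\alpha_k \lambda w_i^k}\!\bigl(x_i^k - \alpha_k [\nabla f(x^k)]_i\bigr),
  \qquad
  \mathcal{S}_{\tau}(z) = \operatorname{sign}(z)\,\max\{|z| - \tau,\, 0\}.
\]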