The elastic net penalty is widely used in high-dimensional statistics for parameter regression and variable selection. It is particularly advantageous over the lasso when the number of predictors greatly exceeds the number of observations. However, empirical evidence has shown that the $\ell_q$-norm penalty (with $0 < q < 1$) often yields better regression results than the $\ell_1$-norm penalty, exhibiting enhanced robustness in a variety of scenarios. In this paper, we study a generalized elastic net model that uses an $\ell_r$-norm (with $r \geq 1$) in the loss function to accommodate various types of noise, and an $\ell_q$-norm (with $0 < q < 1$) in place of the $\ell_1$-norm in the elastic net penalty. Theoretically, we establish computable lower bounds for the nonzero entries of the generalized first-order stationary points of the proposed model. For implementation, we develop two efficient algorithms based on a locally Lipschitz continuous $\epsilon$-approximation of the $\ell_q$-norm: the first employs the alternating direction method of multipliers (ADMM), while the second uses a proximal majorization-minimization method (PMM) whose subproblems are solved by the semismooth Newton method (SSN). Extensive numerical experiments on both simulated and real data show that both algorithms perform well. Notably, PMM-SSN is more efficient than ADMM, although the latter is simpler to implement.
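To make the model concrete, a plausible form of the objective described above is the following sketch; the exact scaling of the loss and the weighting of the two penalty terms are assumptions and may differ from the paper's formulation:

```latex
% Generalized elastic net: \ell_r loss with \ell_q plus ridge penalty.
% Hypothetical notation: A is the design matrix, b the response vector,
% and \lambda_1, \lambda_2 > 0 are regularization parameters.
\min_{x \in \mathbb{R}^n} \;
  \frac{1}{r}\,\| A x - b \|_r^r
  \;+\; \lambda_1 \| x \|_q^q
  \;+\; \lambda_2 \| x \|_2^2,
\qquad r \geq 1, \;\; 0 < q < 1,
```

where $\|x\|_q^q = \sum_{i=1}^{n} |x_i|^q$ (a quasi-norm for $0 < q < 1$). Setting $r = 2$ and $q = 1$ recovers the standard elastic net objective.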