The choice of the tuning parameter in the Lasso is central to its statistical performance in high-dimensional linear regression. Classical consistency theory characterizes the rate at which the tuning parameter should scale, and numerous works provide non-asymptotic guarantees. However, the optimal choice of the tuning parameter within a fully non-asymptotic framework remains incompletely understood. In this work, we study tuning regimes under which the Lasso exhibits suboptimal prediction performance, in the sense that it is dominated in mean squared prediction error by a simple refinement. We further examine how structural features of the design matrix drive this suboptimality, and we discuss extensions to other estimators and to more general noise structures.
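As a minimal illustration of the kind of domination described above (not the paper's actual construction), the sketch below compares the plain Lasso, at a commonly used universal tuning rate, against one simple refinement: refitting ordinary least squares on the Lasso's selected support. The simulated design, signal strength, and tuning constant are all assumptions chosen for illustration.

```python
# Hedged illustration, not the paper's method: plain Lasso vs. a
# "Lasso + OLS refit" refinement, compared by in-sample mean squared
# prediction error against the true regression function.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                 # samples, features, sparsity (assumed)
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 3.0                        # strong signal, chosen for illustration
y = X @ beta + rng.standard_normal(n)

# A common "universal" tuning rate lambda ~ sqrt(log p / n); the constant 2
# is an arbitrary illustrative choice.
lam = 2.0 * np.sqrt(2.0 * np.log(p) / n)
lasso = Lasso(alpha=lam).fit(X, y)

# Refinement: ordinary least squares restricted to the Lasso's support,
# which removes the Lasso's shrinkage bias on the selected coefficients.
support = np.flatnonzero(lasso.coef_)
beta_refit = np.zeros(p)
if support.size:
    ols = LinearRegression(fit_intercept=False).fit(X[:, support], y)
    beta_refit[support] = ols.coef_

def mspe(b):
    """In-sample mean squared prediction error against the truth."""
    return np.mean((X @ (b - beta)) ** 2)

print(f"Lasso MSPE: {mspe(lasso.coef_):.4f}")
print(f"Refit MSPE: {mspe(beta_refit):.4f}")
```

With a strong sparse signal as above, the shrinkage bias of the Lasso typically makes the refitted estimator's prediction error smaller, which is the flavor of domination the abstract refers to; in harder regimes the comparison can go either way.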