Proximal splitting-based convex optimization is a promising approach to linear inverse problems because it allows prior knowledge of the unknown variables to be exploited explicitly. Understanding the behavior of the optimization algorithms is important for tuning their parameters and for developing new algorithms. In this paper, we first analyze the asymptotic behavior of the proximity operator of the squared loss function, which appears in the update equations of several proximal splitting methods for linear inverse problems. Our analysis shows that, in the large system limit, the output of the proximity operator can be characterized by a scalar random variable. We then apply this asymptotic result to predicting the performance of optimization algorithms for compressed sensing. Simulation results demonstrate that the MSE performance of the Douglas-Rachford algorithm can be accurately predicted in compressed sensing with $\ell_{1}$ optimization. We also examine the behavior of the prediction under nonconvex smoothly clipped absolute deviation (SCAD) and minimax concave penalty (MCP) regularization.
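To make the setting concrete, the following sketch shows the two proximity operators involved in the $\ell_{1}$ case and a textbook Douglas-Rachford iteration built from them. This is a minimal illustration under assumed notation (measurement matrix `A`, observation `y`, step size `gamma`, regularization weight `lam`), not the specific algorithm or parameterization analyzed in the paper; the closed form for the squared-loss prox and the soft-thresholding operator are standard results.

```python
import numpy as np

def prox_squared_loss(v, A, y, gamma):
    # Proximity operator of f(x) = (1/2)||y - A x||^2 with step gamma:
    # argmin_x (1/2)||y - A x||^2 + (1/(2*gamma))||x - v||^2,
    # which has the closed form (gamma A^T A + I)^{-1} (gamma A^T y + v).
    n = A.shape[1]
    return np.linalg.solve(gamma * (A.T @ A) + np.eye(n), gamma * (A.T @ y) + v)

def soft_threshold(v, t):
    # Proximity operator of t*||x||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford(A, y, lam, gamma=1.0, iters=2000):
    # Douglas-Rachford splitting for min_x (1/2)||y - A x||^2 + lam*||x||_1.
    z = np.zeros(A.shape[1])
    for _ in range(iters):
        x = prox_squared_loss(z, A, y, gamma)
        z = z + soft_threshold(2.0 * x - z, gamma * lam) - x
    return prox_squared_loss(z, A, y, gamma)
```

For nonconvex SCAD or MCP regularization, only `soft_threshold` would be replaced by the corresponding (still separable) thresholding rule; the squared-loss prox is unchanged, which is why its asymptotic characterization is the common ingredient across these cases.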