We investigate the strong convergence properties of a proximal-gradient inertial algorithm with two Tikhonov regularization terms, in connection with the problem of minimizing the sum of a convex lower semicontinuous function $f$ and a smooth convex function $g$. For an appropriate setting of the parameters we show strong convergence of the generated sequence $(x_k)$ to the minimum-norm minimizer of the objective function $f+g$. Further, we obtain fast convergence to zero of the objective function values along the generated sequence, as well as of the discrete velocity and of a subgradient of the objective function. We also show that, for another setting of the parameters, the optimal rate of order $\mathcal{O}(k^{-2})$ can be obtained for the potential energy $(f+g)(x_k)-\min(f+g)$.
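For orientation, one common template for a Tikhonov-regularized inertial proximal-gradient iteration is sketched below. This is an illustrative form only: the step size $\lambda$, the inertial coefficients $\alpha_k$, and the two vanishing regularization parameters $\epsilon_k,\delta_k$ are placeholders, and the exact coupling of the parameters is as specified in the paper, not necessarily as written here.
\begin{align*}
y_k &= x_k + \alpha_k (x_k - x_{k-1}) - \lambda \delta_k x_k,\\
x_{k+1} &= \operatorname{prox}_{\lambda f}\!\big(y_k - \lambda \nabla g(y_k) - \lambda \epsilon_k y_k\big),
\end{align*}
where $\operatorname{prox}_{\lambda f}(z)=\arg\min_{u}\big\{f(u)+\tfrac{1}{2\lambda}\|u-z\|^2\big\}$. In such schemes the Tikhonov terms $\epsilon_k,\delta_k\to 0$ as $k\to\infty$, and their decay rate relative to $\alpha_k$ governs whether one obtains strong convergence to the minimum-norm minimizer or the fast rate for the function values.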