This paper introduces a preconditioned convex splitting algorithm with line search for nonconvex optimization problems. The algorithm applies a second-order backward differentiation formula (BDF2) to the implicit, linear components and an Adams-Bashforth scheme to the explicit, nonlinear parts of the gradient flow of the variational functional. The resulting method resembles a generalized difference-of-convex-functions approach in which the pair of convex functions changes from iteration to iteration, and it incorporates an Armijo line search strategy to improve performance. Classical preconditioners such as symmetric Gauss-Seidel, Jacobi, and Richardson are also discussed in this setting. The global convergence of the algorithm is established through the Kurdyka-{\L}ojasiewicz property, ensuring convergence within a finite number of preconditioned iterations. Numerical experiments demonstrate the superiority of the proposed second-order convex splitting with line search over conventional difference-of-convex-functions algorithms.
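To make the time discretization behind the splitting concrete, the display below is a minimal sketch, assuming a splitting of the objective $E = E_c - E_e$ into two convex parts, a step size $\tau > 0$, and iterates $u^n$; these symbols are not defined in the abstract, and the precise scheme in the paper may differ.
\[
\frac{3u^{n+1} - 4u^{n} + u^{n-1}}{2\tau}
  \;=\; -\,\nabla E_c\!\left(u^{n+1}\right)
        \;+\; \nabla E_e\!\left(2u^{n} - u^{n-1}\right).
\]
Here BDF2 treats the implicit, convex part, while the Adams-Bashforth extrapolation $2u^{n} - u^{n-1}$ handles the nonlinear part explicitly; in the preconditioned variant the exact solve for $u^{n+1}$ would be replaced by finitely many preconditioned iterations, and an Armijo line search would scale the resulting update.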