This paper introduces a preconditioned convex splitting algorithm enhanced with a line search technique for nonconvex optimization problems. The algorithm applies the second-order backward differentiation formula (BDF2) to the implicit, linear part and the Adams-Bashforth scheme to the nonlinear, explicit part of the gradient flow of the variational functional. The resulting method resembles a generalized difference-of-convex-functions approach in which the set of convex functions changes at each iteration, and it incorporates the Armijo line search strategy to improve performance. Classical preconditioners such as symmetric Gauss-Seidel, Jacobi, and Richardson are also discussed in this setting. Global convergence of the algorithm is established via the Kurdyka-{\L}ojasiewicz property, ensuring convergence with a finite number of preconditioned iterations. Numerical experiments demonstrate that the proposed second-order convex splitting with line search outperforms conventional difference-of-convex-functions algorithms.
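As a rough illustration of the ingredients described above, the following Python sketch combines a BDF2 treatment of a convex quadratic (implicit) part with an Adams-Bashforth extrapolation of the explicit part, an inexact inner solve by a fixed number of Jacobi-preconditioned iterations, and Armijo backtracking on the resulting direction. The test energy (a discrete double-well / Allen-Cahn-type functional), all function names, and all parameter values are illustrative assumptions, not the paper's actual scheme.

```python
import numpy as np

def laplacian_1d(n):
    """Standard 1D finite-difference Laplacian with homogeneous Dirichlet BCs (illustrative)."""
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def energy(u, A, Ee):
    """Total energy E(u) = 0.5 u^T A u - E_e(u): convex quadratic part minus explicit part."""
    return 0.5 * u @ (A @ u) - Ee(u)

def jacobi_solve(M, rhs, x0, iters=30):
    """A fixed number of Jacobi (diagonally preconditioned) iterations for M x = rhs."""
    d = np.diag(M)
    x = x0.copy()
    for _ in range(iters):
        x = x + (rhs - M @ x) / d
    return x

def bdf2_ab2_step(u_n, u_nm1, tau, A, Ee, grad_Ee, beta=0.5, c=1e-4):
    """One BDF2 / Adams-Bashforth convex-splitting step, damped by Armijo backtracking:
       (3 u^{n+1} - 4 u^n + u^{n-1}) / (2 tau) = -A u^{n+1} + grad_Ee(2 u^n - u^{n-1})."""
    n = u_n.size
    u_extrap = 2.0 * u_n - u_nm1                      # explicit extrapolation of the nonlinear part
    rhs = (4.0 * u_n - u_nm1) / (2.0 * tau) + grad_Ee(u_extrap)
    M = A + (3.0 / (2.0 * tau)) * np.eye(n)
    u_trial = jacobi_solve(M, rhs, u_n)               # inexact implicit solve with a preconditioner
    d = u_trial - u_n                                 # search direction for the line search
    g = A @ u_n - grad_Ee(u_n)                        # full gradient of E at u_n
    E0 = energy(u_n, A, Ee)
    t = 1.0
    while energy(u_n + t * d, A, Ee) > E0 + c * t * (g @ d) and t > 1e-8:
        t *= beta                                     # Armijo backtracking
    return u_n + t * d

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 64
    A = laplacian_1d(n) + 2.0 * np.eye(n)             # convex quadratic part E_c(u) = 0.5 u^T A u
    Ee = lambda u: np.sum(1.5 * u**2 - 0.25 * u**4)   # explicit part; E = E_c - E_e is a double-well energy
    grad_Ee = lambda u: 3.0 * u - u**3
    u_prev = 0.1 * rng.standard_normal(n)
    u_curr = u_prev.copy()
    for _ in range(300):
        u_prev, u_curr = u_curr, bdf2_ab2_step(u_curr, u_prev, 0.1, A, Ee, grad_Ee)
    print("final energy:", energy(u_curr, A, Ee))
```

In this sketch the Armijo test uses the full gradient of the nonconvex energy, so the backtracking only damps the inexactly computed BDF2 direction; how the paper couples the line search with the preconditioned inner iterations may differ.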