In this paper we consider a nonconvex unconstrained optimization problem of minimizing a twice differentiable objective function with H\"older continuous Hessian. Specifically, we first propose a Newton-conjugate gradient (Newton-CG) method for finding an approximate first- and second-order stationary point of this problem, assuming the associated H\"older parameters are explicitly known. We then develop a parameter-free Newton-CG method that requires no prior knowledge of these parameters. To the best of our knowledge, this is the first parameter-free second-order method to achieve the best-known iteration and operation complexity for finding an approximate first- and second-order stationary point of this problem. Finally, we present preliminary numerical results demonstrating the superior practical performance of our parameter-free Newton-CG method over a well-known regularized Newton method.