We propose adaptive, line-search-free second-order methods with an optimal rate of convergence for solving convex-concave min-max problems. By means of an adaptive step size, our algorithms feature a simple update rule that requires solving only one linear system per iteration, eliminating the need for line-search or backtracking mechanisms. Specifically, we base our algorithms on the optimistic method and appropriately combine it with second-order information. Moreover, in contrast to common adaptive schemes, we define the step size recursively as a function of the gradient norm and the prediction error in the optimistic update. We first analyze a variant whose step size requires knowledge of the Lipschitz constant of the Hessian. Under the additional assumption of Lipschitz continuous gradients, we further design a parameter-free version by tracking the Hessian Lipschitz constant locally, while ensuring that the iterates remain bounded. Finally, we evaluate the practical performance of our algorithms by comparing them to existing second-order methods for minimax optimization.
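The abstract does not spell out the update rule, so the following is only a rough sketch of the kind of iteration it describes: a regularized Newton-type step on the saddle operator of a toy bilinear problem f(x, y) = x*y, solving a single linear system per iteration, paired with a hypothetical adaptive step-size rule driven by the operator norm. The step-size rule and all names here are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

# Toy bilinear saddle problem f(x, y) = x * y.
# Its saddle operator F(z) = (grad_x f, -grad_y f) is linear: F(z) = A z.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

def F(z):
    return A @ z

z = np.array([1.0, 1.0])  # start away from the saddle point at the origin
for _ in range(50):
    nF = np.linalg.norm(F(z))
    # Hypothetical adaptive step size (illustration only): shrink the
    # regularization as the operator norm shrinks, with a cap.
    eta = min(10.0, 1.0 / max(nF, 1e-8))
    # One linear system per iteration: (I/eta + J_F(z)) d = -F(z),
    # where J_F(z) = A is the Jacobian of the saddle operator.
    d = np.linalg.solve(np.eye(2) / eta + A, -F(z))
    z = z + d

print(np.linalg.norm(z))  # the iterate is driven toward the saddle at (0, 0)
```

For this bilinear example the regularized step contracts the distance to the saddle point at every iteration for any positive step size, which is why no line search or backtracking appears in the loop.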