We study constrained comonotone min-max optimization, a structured class of nonconvex-nonconcave min-max optimization problems, and its generalization to comonotone inclusion. As our first contribution, we extend the Extra Anchored Gradient (EAG) algorithm, originally proposed by Yoon and Ryu (2021) for unconstrained min-max optimization, to constrained comonotone min-max optimization and comonotone inclusion, achieving an optimal convergence rate of $O\left(\frac{1}{T}\right)$ among all first-order methods. Moreover, we prove that the algorithm's iterates converge to a point in the solution set. As our second contribution, we extend the Fast Extra Gradient (FEG) algorithm of Lee and Kim (2021) to constrained comonotone min-max optimization and comonotone inclusion, achieving the same $O\left(\frac{1}{T}\right)$ convergence rate. This rate applies to the broadest class of comonotone inclusion problems studied in the literature to date. Our analyses rest on simple potential-function arguments, which may also be useful for analyzing other accelerated algorithms.
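To make the anchoring idea concrete, the unconstrained EAG iteration of Yoon and Ryu (2021) can be sketched as follows. This is a sketch in our own notation, not the constrained scheme analyzed here: $F$ denotes the operator (for min-max problems, the gradient descent-ascent field), $\beta_k$ the anchoring weight pulling toward the initial point $x_0$ (with $\beta_k = \tfrac{1}{k+2}$ in the original scheme), and $\eta_k$ the step size; the constrained and inclusion settings replace these explicit steps with projection or resolvent steps.

```latex
\begin{align*}
  x_{k+1/2} &= x_k + \beta_k\,(x_0 - x_k) - \eta_k F(x_k), \\
  x_{k+1}   &= x_k + \beta_k\,(x_0 - x_k) - \eta_k F(x_{k+1/2}).
\end{align*}
```

The anchoring term $\beta_k (x_0 - x_k)$ is what distinguishes EAG from the classical extragradient method and underlies the accelerated $O\left(\frac{1}{T}\right)$ rate.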