Leveraging recent advancements in adaptive methods for convex minimization problems, this paper provides a linesearch-free proximal gradient framework for globalizing the convergence of popular stepsize choices such as Barzilai-Borwein and one-dimensional Anderson acceleration. This framework can cope with problems in which the gradient of the differentiable function is merely locally H\"older continuous. Our analysis not only encompasses but also refines existing results upon which it builds. The theory is corroborated by numerical evidence that showcases the synergistic interplay between fast stepsize selections and adaptive methods.
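To make the setting concrete, the following is a minimal sketch (not the paper's method) of a proximal gradient iteration using the classical BB1 Barzilai-Borwein stepsize on a lasso problem, $\tfrac12\|Ax-b\|^2 + \lambda\|x\|_1$. The crude positivity safeguard on the curvature term is an illustrative assumption; without a linesearch, this plain combination has no convergence guarantee in general, which is precisely the gap the abstract's framework addresses.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_bb(A, b, lam, x0, iters=200):
    """Proximal gradient for 0.5*||Ax - b||^2 + lam*||x||_1
    with a crudely safeguarded BB1 stepsize (illustrative only)."""
    x = x0.copy()
    grad = A.T @ (A @ x - b)
    # Conservative first stepsize: 1 / L with L = ||A||_2^2.
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(iters):
        x_new = soft_threshold(x - gamma * grad, gamma * lam)
        grad_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, grad_new - grad
        sy = s @ y
        if sy > 1e-12:
            # BB1 stepsize: <s, s> / <s, y>; kept only if curvature is positive.
            gamma = (s @ s) / sy
        x, grad = x_new, grad_new
    return x

# Tiny synthetic lasso instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
b = A @ x_true
x = prox_grad_bb(A, b, lam=0.1, x0=np.zeros(5))
```

In practice the BB stepsize often yields much faster progress than the fixed $1/L$ step, but its acceptance is classically enforced via a linesearch; the framework above replaces that mechanism with linesearch-free adaptive safeguards.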