We show that adaptive proximal gradient methods for convex problems are not restricted to traditional Lipschitzian assumptions. Our analysis reveals that a class of linesearch-free methods remains convergent under mere local H\"older gradient continuity, covering in particular continuously differentiable semi-algebraic functions. To mitigate the lack of local Lipschitz continuity, popular approaches revolve around $\varepsilon$-oracles and/or linesearch procedures. In contrast, we exploit plain H\"older inequalities that entail no approximation, all while retaining the linesearch-free nature of adaptive schemes. Furthermore, we prove full sequence convergence without prior knowledge of the local H\"older constants or the order of H\"older continuity. Numerical experiments compare against baseline methods on diverse machine learning tasks covering both the locally and the globally H\"older setting.
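To make the abstract concrete, the following is a minimal sketch of a linesearch-free adaptive proximal gradient iteration in the spirit described above: the stepsize is estimated on the fly from observed gradient differences, so the user supplies neither a Lipschitz nor a H\"older constant. The specific stepsize rule shown (geometric growth capped by a local curvature estimate, as in Malitsky--Mishchenko-type adaptive schemes) is an illustrative assumption, not necessarily the exact rule analyzed in the paper; the lasso instance at the end is likewise only a toy example.

```python
import numpy as np

def soft_threshold(v, tau):
    # proximal operator of tau * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def adaptive_prox_grad(grad_f, prox_g, x0, gamma0=1e-6, iters=500):
    """Linesearch-free adaptive proximal gradient (illustrative sketch).

    The stepsize gamma is adapted from observed gradient differences
    (a Malitsky--Mishchenko-style rule, assumed here for illustration);
    no Lipschitz/Holder constants are required from the user.
    """
    x_prev = x0
    g_prev = grad_f(x0)
    gamma_prev = gamma0
    theta = 0.0
    x = prox_g(x_prev - gamma_prev * g_prev, gamma_prev)
    for _ in range(iters):
        g = grad_f(x)
        diff_g = np.linalg.norm(g - g_prev)
        if diff_g == 0.0:
            break  # gradients identical: iterates have stabilized
        # local curvature estimate replaces a global Lipschitz constant
        local = np.linalg.norm(x - x_prev) / (2.0 * diff_g)
        gamma = min(np.sqrt(1.0 + theta) * gamma_prev, local)
        theta = gamma / gamma_prev
        x_prev, g_prev, gamma_prev = x, g, gamma
        x = prox_g(x - gamma * g, gamma)
    return x

# Toy example: lasso, min_x 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, gamma: soft_threshold(v, gamma * lam)
x_star = adaptive_prox_grad(grad_f, prox_g, np.zeros(10))
```

At a solution, the prox-gradient fixed-point residual $x - \mathrm{prox}_{\gamma g}(x - \gamma \nabla f(x))$ vanishes for any $\gamma > 0$, which gives a simple stopping/validation criterion independent of the adaptive stepsize sequence.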