The convergence rate of first-order optimization algorithms is a pivotal concern in the numerical optimization community, as it directly reflects the efficiency of these algorithms across different optimization problems. Our goal is to take a significant step forward in the formal mathematical representation of optimization techniques using the Lean4 theorem prover. We first formalize the gradient for smooth functions and the subgradient for convex functions on a Hilbert space, laying the groundwork for the accurate formalization of algorithmic structures. We then extend our contribution by proving several properties of differentiable convex functions that have not yet been formalized in Mathlib. Finally, a comprehensive formalization of these algorithms is presented. These developments are not only noteworthy in their own right but also serve as essential precursors to the formalization of a broader spectrum of numerical algorithms and their applications in machine learning as well as many other areas.
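To illustrate the kind of definition the abstract refers to, the following is a minimal sketch of how a subgradient might be stated in Lean4 with Mathlib; the names `SubgradientAt` and `HasSubgradientAt` are illustrative assumptions, not necessarily the identifiers used in the actual development.

```lean
import Mathlib.Analysis.InnerProductSpace.Basic

-- Hypothetical sketch: `g` is a subgradient of `f` at `x` on a real
-- inner product space `E` iff the subgradient inequality
--   f y ≥ f x + ⟪g, y - x⟫  for all y
-- holds. Names and exact typeclass assumptions are illustrative.
def HasSubgradientAt {E : Type*} [NormedAddCommGroup E]
    [InnerProductSpace ℝ E] (f : E → ℝ) (g x : E) : Prop :=
  ∀ y, f x + inner g (y - x) ≤ f y

-- The subdifferential at `x` is then the set of all such `g`.
def SubgradientAt {E : Type*} [NormedAddCommGroup E]
    [InnerProductSpace ℝ E] (f : E → ℝ) (x : E) : Set E :=
  {g | HasSubgradientAt f g x}
```

For a smooth convex function, the unique element of this set coincides with the gradient, which is the bridge between the two formalizations mentioned in the abstract.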