Analyzing the acceleration behavior of preconditioned gradient-based eigensolvers is a substantial theoretical challenge. In this work, we present a novel framework for preconditioning on Riemannian manifolds and introduce a metric, the leading angle, for evaluating preconditioners for symmetric eigenvalue problems. We extend the locally optimal Riemannian accelerated gradient method from Riemannian convex optimization to develop the Riemannian Acceleration with Preconditioning (RAP) method for symmetric eigenvalue problems, thereby providing theoretical evidence for its acceleration. Our analysis of the Schwarz preconditioner for elliptic eigenvalue problems shows that RAP achieves a convergence rate of $1-C\kappa^{-1/2}$, improving on the preconditioned steepest descent method's rate of $1-C\kappa^{-1}$. The exponent $-1/2$ in $\kappa^{-1/2}$ is sharp, and numerical experiments confirm our theoretical findings.
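To illustrate what the improvement from $1-C\kappa^{-1}$ to $1-C\kappa^{-1/2}$ means in practice, the following sketch (not from the paper; $C=1$ and the tolerance are arbitrary choices for illustration) counts the iterations each linear rate needs to reduce the error by a fixed factor:

```python
import math

def iterations_to_tol(rate, tol=1e-8):
    # Smallest n with rate**n <= tol, i.e. n >= log(tol) / log(rate).
    return math.ceil(math.log(tol) / math.log(rate))

C = 1.0  # illustrative constant; the paper's C depends on the problem
for kappa in (1e2, 1e4, 1e6):
    sd = iterations_to_tol(1 - C / kappa)               # steepest-descent-type rate
    rap = iterations_to_tol(1 - C / math.sqrt(kappa))   # accelerated-type rate
    print(f"kappa={kappa:.0e}: 1-C/kappa needs ~{sd}, 1-C/sqrt(kappa) needs ~{rap}")
```

The iteration count grows roughly like $\kappa$ for the unaccelerated rate but only like $\kappa^{1/2}$ for the accelerated one, which is the usual signature of Nesterov-type acceleration.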