In this paper, we propose, analyze, and demonstrate a dynamic momentum method to accelerate power and inverse power iterations with minimal computational overhead. The method can be applied to real diagonalizable matrices, is provably convergent with acceleration in the symmetric case, and does not require a priori spectral knowledge. We review and extend background results on previously developed static momentum accelerations for the power iteration through the connection between the momentum-accelerated iteration and the standard power iteration applied to an augmented matrix. We show that the augmented matrix is defective for the optimal parameter choice. We then present our dynamic method, which updates the momentum parameter at each iteration based on the Rayleigh quotient and two previous residuals. We develop convergence and stability theory for the method by considering a power-like iteration that multiplies an initial vector by a sequence of augmented matrices. We demonstrate the method on a number of benchmark problems and observe that it outperforms the power iteration and, in many cases, the static momentum acceleration with the optimal parameter choice. Finally, we present and demonstrate an explicit extension of the algorithm to inverse power iterations.
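For context, the static momentum acceleration referenced above takes the three-term form x_{k+1} = A x_k − β x_{k−1}, with the optimal static choice β = λ₂²/4 requiring knowledge of the second eigenvalue, which is exactly the spectral information the dynamic method avoids. The following is a minimal NumPy sketch of the static iteration under these assumptions; the function name, test parameters, and stopping tolerance are illustrative, not from the paper:

```python
import numpy as np

def momentum_power_iteration(A, beta, iters=500, tol=1e-10):
    """Static momentum power iteration x_{k+1} = A x_k - beta * x_{k-1},
    normalized at each step. beta is a fixed momentum parameter; the
    optimal choice beta = lambda_2**2 / 4 assumes spectral knowledge."""
    n = A.shape[0]
    rng = np.random.default_rng(0)
    x_prev = np.zeros(n)
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    mu = x @ A @ x
    for _ in range(iters):
        y = A @ x - beta * x_prev
        norm = np.linalg.norm(y)
        # Scale both iterates by the same factor to preserve the recurrence.
        x_prev = x / norm
        x = y / norm
        mu = x @ A @ x  # Rayleigh quotient estimate of the dominant eigenvalue
        if np.linalg.norm(A @ x - mu * x) < tol:
            break
    return mu, x
```

For a symmetric matrix with eigenvalues 3, 1, 0.5, the choice β = 1²/4 = 0.25 maps the dominant eigenvalue well outside the disk of radius √β containing the rest of the spectrum, so the iteration converges to λ₁ = 3 far faster than the plain power method.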