We analyze the Accelerated Noisy Power Method, an algorithm for Principal Component Analysis in the setting where only inexact matrix-vector products are available, which can arise for instance in decentralized PCA. While previous works have established that acceleration can improve convergence rates compared to the standard Noisy Power Method, these guarantees require overly restrictive upper bounds on the magnitude of the perturbations, limiting their practical applicability. We provide an improved analysis of this algorithm, which preserves the accelerated convergence rate under much milder conditions on the perturbations. We show that our new analysis is worst-case optimal, in the sense that the convergence rate cannot be improved, and that the noise conditions we derive cannot be relaxed without sacrificing convergence guarantees. We demonstrate the practical relevance of our results by deriving an accelerated algorithm for decentralized PCA, which has similar communication costs to non-accelerated methods. To our knowledge, this is the first decentralized algorithm for PCA with provably accelerated convergence.
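To make the setting concrete, here is a minimal sketch of the noisy power method and its momentum-accelerated (heavy-ball) variant, in which every matrix-vector product is perturbed. The noise model (i.i.d. Gaussian perturbations), the momentum parameter `beta`, and the joint-rescaling trick for the two-term recursion are illustrative assumptions, not the paper's exact algorithm or analysis.

```python
import numpy as np

def noisy_power_method(A, num_iters=100, noise_scale=1e-6, seed=0):
    """Power method where each product A @ x is only available inexactly.

    The perturbation model (additive Gaussian noise) is an assumption
    made for illustration.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(num_iters):
        g = A @ x + noise_scale * rng.standard_normal(A.shape[0])  # inexact product
        x = g / np.linalg.norm(g)
    return x

def accelerated_noisy_power_method(A, beta, num_iters=100, noise_scale=1e-6, seed=0):
    """Heavy-ball acceleration: x_{t+1} = A x_t - beta * x_{t-1}, with noisy products.

    Both iterates are rescaled by the same factor each step; scaling the pair
    (x_t, x_{t-1}) by a common constant leaves the direction dynamics of the
    two-term recursion unchanged, so normalization does not break acceleration.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x_prev = np.zeros(n)
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(num_iters):
        g = A @ x + noise_scale * rng.standard_normal(n) - beta * x_prev
        nrm = np.linalg.norm(g)
        x_prev = x / nrm  # rescale the lagged iterate by the same factor
        x = g / nrm
    return x
```

For a spectral gap between the top two eigenvalues λ1 > λ2, a standard choice in the momentum power method literature is β ≈ (λ2 / 2)²; this tuning is assumed here for illustration.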