Graph propagation (GP) computation plays a crucial role in graph data analysis, supporting various applications such as graph node similarity queries, graph node ranking, graph clustering, and graph neural networks. Existing methods, which mainly rely on power iteration or push computation frameworks, often suffer from slow convergence when applied to large-scale graphs. To address this issue, we propose a novel and powerful approach that accelerates power iteration and push methods using Chebyshev polynomials. Specifically, we first present a novel Chebyshev expansion formula for general GP functions, offering a new perspective on GP computation and achieving accelerated convergence. Building on these theoretical insights, we develop a novel Chebyshev power iteration method (\ltwocheb) and a novel Chebyshev push method (\chebpush). Our \ltwocheb method achieves an approximate $O(\sqrt{N})$ acceleration over existing power iteration techniques for both personalized PageRank and heat kernel PageRank computations, which are well-studied GP problems. For \chebpush, we propose an innovative subset Chebyshev recurrence technique, enabling the design of a push-style local algorithm with provable error guarantees and reduced time complexity compared to existing push methods. We conduct extensive experiments on five large real-world datasets to evaluate our proposed algorithms, demonstrating their superior efficiency compared to state-of-the-art approaches.
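To make the idea of Chebyshev-accelerated propagation concrete, the sketch below applies the classical Chebyshev iteration (not the paper's \ltwocheb algorithm) to the personalized PageRank linear system $(I - (1-\alpha)P)\,x = \alpha s$, where $P = D^{-1/2} A D^{-1/2}$ is the symmetrically normalized adjacency matrix, so the system matrix has eigenvalues in $[\alpha, 2-\alpha]$. The 4-node cycle graph, the teleport probability, and the seed vector are illustrative choices; the $O(\sqrt{\kappa})$ dependence on the condition number is what underlies the $O(\sqrt{N})$-type speedup over plain power iteration.

```python
import numpy as np

# Illustrative graph: an undirected 4-cycle (assumption, not from the paper).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
deg = A.sum(axis=1)
P = A / np.sqrt(np.outer(deg, deg))        # D^{-1/2} A D^{-1/2}

a = 0.2                                    # teleport probability (illustrative)
M = np.eye(4) - (1 - a) * P                # spectrum contained in [a, 2 - a]
b = a * np.array([1.0, 0.0, 0.0, 0.0])     # seed vector s = e_0

# Classical Chebyshev iteration for M x = b with spectral bounds [lmin, lmax].
lmin, lmax = a, 2 - a
d = (lmax + lmin) / 2                      # center of the spectrum
c = (lmax - lmin) / 2                      # half-width of the spectrum

x = np.zeros(4)
r = b - M @ x
for i in range(1, 31):
    if i == 1:
        p = r.copy()
        alpha = 1 / d
    else:
        # Chebyshev recurrence coefficients (special case at i == 2).
        beta = 0.5 * (c * alpha) ** 2 if i == 2 else (c * alpha / 2) ** 2
        alpha = 1 / (d - beta / alpha)
        p = r + beta * p
    x = x + alpha * p
    r = b - M @ x

# x now approximates the normalized-adjacency personalized PageRank vector.
```

With condition number $\kappa = (2-\alpha)/\alpha$, the residual shrinks by roughly $(\sqrt{\kappa}-1)/(\sqrt{\kappa}+1)$ per step, versus $1-\alpha$ for plain power iteration, which is the square-root-style gain the abstract refers to.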