Federated Graph Learning (FGL) aims to collaboratively and privately optimize graph models on divergent data for different tasks. A critical challenge in FGL is enabling effective yet efficient federated optimization under multifaceted graph heterogeneity to enhance mutual performance. However, existing FGL works primarily address graph data heterogeneity and cannot handle graph task heterogeneity. To address this challenge, we propose a Federated Graph Prompt Learning (FedGPL) framework that efficiently enables prompt-based asymmetric graph knowledge transfer among multifaceted heterogeneous federated participants. Specifically, we establish a split federated framework to separately preserve universal and domain-specific graph knowledge. Moreover, we develop two algorithms to mitigate task and data heterogeneity for improved federated knowledge preservation. First, a Hierarchical Directed Transfer Aggregator (HiDTA) delivers cross-task beneficial knowledge that is hierarchically distilled according to directional transferability. Second, a Virtual Prompt Graph (VPG) adaptively generates graph structures that enhance data utility by distinguishing dominant subgraphs and neutralizing redundant ones. We conduct theoretical analyses and extensive experiments, demonstrating that FedGPL significantly outperforms state-of-the-art baselines in both accuracy and efficiency under multifaceted graph heterogeneity on large-scale federated graph datasets.
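The split federated framework separates parameters that are shared across participants (universal knowledge) from parameters that stay local (domain- or task-specific prompts). The following is a minimal sketch of that idea in plain Python; all class and function names (`Client`, `fed_avg`, `local_update`) are hypothetical illustrations, not the paper's implementation, and the gradients are placeholder values:

```python
def fed_avg(client_params):
    """Average shared parameter vectors element-wise across clients."""
    n = len(client_params)
    dim = len(client_params[0])
    return [sum(p[i] for p in client_params) / n for i in range(dim)]

class Client:
    """A federated participant with split parameters (hypothetical sketch)."""
    def __init__(self, shared, prompt):
        self.shared = shared  # universal knowledge, sent to the server
        self.prompt = prompt  # task-specific prompt, never leaves the client

    def local_update(self, grad_shared, grad_prompt, lr=0.1):
        # One local gradient step on both parameter groups.
        self.shared = [w - lr * g for w, g in zip(self.shared, grad_shared)]
        self.prompt = [w - lr * g for w, g in zip(self.prompt, grad_prompt)]

# One federated round over two clients with placeholder gradients:
clients = [Client([1.0, 2.0], [0.5]), Client([3.0, 4.0], [0.7])]
for c in clients:
    c.local_update([0.1, 0.1], [0.05])

# Server aggregates only the shared parameters, then broadcasts them back;
# each client's prompt remains private and personalized.
global_shared = fed_avg([c.shared for c in clients])
for c in clients:
    c.shared = list(global_shared)
```

The key design point the sketch illustrates is that aggregation touches only the shared parameters, so heterogeneous tasks can keep divergent prompts while still pooling universal graph knowledge.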