Subgraph federated learning (subgraph-FL) is a new distributed paradigm that facilitates the collaborative training of graph neural networks (GNNs) across multi-client subgraphs. However, a significant challenge in subgraph-FL arises from subgraph heterogeneity, which stems from node and topology variation across clients and impairs the performance of the global GNN. Despite various studies, the impact mechanism of subgraph heterogeneity has not yet been thoroughly investigated. To this end, we decouple node and topology variation, revealing that they correspond to differences in label distribution and structural homophily, respectively. Remarkably, these differences lead to significant variation in the class-wise knowledge reliability of the local GNNs, misguiding model aggregation to varying degrees. Building on this insight, we propose FedTAD, a topology-aware data-free knowledge distillation method that enhances reliable knowledge transfer from the local models to the global model. Extensive experiments on six public datasets consistently demonstrate the superiority of FedTAD over state-of-the-art baselines.
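To make the distillation mechanism described above concrete, the following is a minimal sketch, not the authors' implementation, of topology-aware data-free knowledge distillation in the spirit of FedTAD, assuming plain PyTorch. All names here (DenseGCN, PseudoGraphGenerator, distill_step, reliability) are hypothetical illustrations: a generator synthesizes a pseudo-graph so no real client data is needed, each local (teacher) model's predictions are weighted by its per-class reliability, and the global (student) model is trained to match the weighted ensemble. The generator's own training objective (typically adversarial in data-free KD) is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseGCN(nn.Module):
    """Two-layer GCN on a dense adjacency matrix (stand-in for any GNN)."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, n_classes)

    def forward(self, x, adj):
        # Symmetric normalization: D^{-1/2} (A + I) D^{-1/2}
        a = adj + torch.eye(adj.size(0), device=adj.device)
        d = a.sum(-1).clamp(min=1e-8).pow(-0.5)
        a = d.unsqueeze(1) * a * d.unsqueeze(0)
        h = F.relu(self.lin1(a @ x))
        return self.lin2(a @ h)

class PseudoGraphGenerator(nn.Module):
    """Maps noise + target labels to node features and a soft adjacency,
    so distillation needs no real client data (the 'data-free' part)."""
    def __init__(self, noise_dim, n_classes, feat_dim):
        super().__init__()
        self.feat = nn.Sequential(
            nn.Linear(noise_dim + n_classes, 128), nn.ReLU(),
            nn.Linear(128, feat_dim))

    def forward(self, z, y_onehot):
        x = self.feat(torch.cat([z, y_onehot], dim=-1))
        # Topology-aware piece: derive soft edges from feature similarity
        adj = torch.sigmoid(x @ x.t())
        return x, adj

def distill_step(global_model, local_models, reliability, gen, n_classes,
                 opt_student, n_nodes=64, noise_dim=32, temp=2.0):
    """One round of data-free distillation: each teacher's soft labels are
    weighted by its class-wise reliability (reliability[k] is a hypothetical
    length-n_classes vector per client, derived e.g. from label-distribution
    and homophily statistics) before the student matches the ensemble."""
    z = torch.randn(n_nodes, noise_dim)
    y = torch.randint(0, n_classes, (n_nodes,))
    y1h = F.one_hot(y, n_classes).float()
    with torch.no_grad():
        x, adj = gen(z, y1h)
        teacher = torch.zeros(n_nodes, n_classes)
        for k, m in enumerate(local_models):
            w = reliability[k][y].unsqueeze(1)  # per-node weight by class
            teacher += w * F.softmax(m(x, adj) / temp, dim=-1)
        teacher /= teacher.sum(-1, keepdim=True).clamp(min=1e-8)

    student = F.log_softmax(global_model(x, adj) / temp, dim=-1)
    loss = F.kl_div(student, teacher, reduction="batchmean")
    opt_student.zero_grad(); loss.backward(); opt_student.step()
    return loss.item()
```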