Subgraph federated learning (subgraph-FL) is a new distributed paradigm that facilitates the collaborative training of graph neural networks (GNNs) across multi-client subgraphs. A significant challenge of subgraph-FL, however, arises from subgraph heterogeneity, which stems from node and topology variation and impairs the performance of the global GNN. Despite various studies, the impact mechanism of subgraph heterogeneity has not yet been thoroughly investigated. To this end, we decouple node and topology variation, revealing that they correspond to differences in label distribution and structure homophily, respectively. Remarkably, these variations lead to significant differences in the class-wise knowledge reliability of the local GNNs, misguiding model aggregation to varying degrees. Building on this insight, we propose a topology-aware data-free knowledge distillation technique (FedTAD) that enhances reliable knowledge transfer from the local models to the global model. Extensive experiments on six public datasets consistently demonstrate the superiority of FedTAD over state-of-the-art baselines.
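The core idea — that local models are reliable on different classes, so their knowledge should be combined class-wise rather than uniformly — can be illustrated with a minimal NumPy sketch. This is a hypothetical simplification for intuition only: the reliability scoring below (label counts scaled by a homophily score) and the weighted soft-label ensemble are assumptions, not the exact formulation used by FedTAD.

```python
import numpy as np

def reliability_weights(label_counts, homophily):
    """Hypothetical class-wise reliability of each client's local GNN.

    label_counts: (num_clients, num_classes) node counts per class.
    homophily:    (num_clients,) structure-homophily score per client.
    A client with more labeled nodes of a class and higher homophily is
    assumed more reliable on that class; weights normalize over clients.
    """
    scores = label_counts * homophily[:, None]
    total = scores.sum(axis=0, keepdims=True)
    return scores / np.clip(total, 1e-12, None)

def ensemble_soft_labels(local_logits, weights):
    """Combine local models' predictions class-wise for distillation.

    local_logits: (num_clients, num_samples, num_classes) outputs of
                  local models on (e.g. generated) pseudo data.
    weights:      (num_clients, num_classes) class-wise reliability.
    Returns per-sample soft labels to supervise the global model.
    """
    probs = np.exp(local_logits - local_logits.max(-1, keepdims=True))
    probs /= probs.sum(-1, keepdims=True)  # softmax per local model
    # Weight each client's probability mass on class c by its reliability on c,
    # so unreliable classes contribute little to the distilled target.
    mixed = (weights[:, None, :] * probs).sum(axis=0)
    return mixed / mixed.sum(-1, keepdims=True)
```

In a data-free setting, `local_logits` would come from evaluating the local models on synthesized pseudo-graphs, since raw client data never leaves the clients; the class-wise weights then keep unreliable local knowledge from misguiding the global model.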