Federated Graph Learning (FGL) aims to train machine learning models, such as Graph Neural Networks (GNNs), for multiple clients, each holding its own graph data. Existing methods usually assume that every client has both the node features and the graph structure of its graph data. In real-world scenarios, however, there exist federated systems where only some of the clients possess both, while the other clients (i.e., graphless clients) have only node features. This naturally leads to a novel problem in FGL: how can we jointly train a model over distributed graph data in the presence of graphless clients? In this paper, we propose a novel framework, FedGLS, to tackle this problem. In FedGLS, we devise a local graph learner on each graphless client that learns the local graph structure using structure knowledge transferred from other clients. To enable structure knowledge transfer, we design a GNN model and a feature encoder on each client. During local training, the feature encoder retains the local graph structure knowledge alongside the GNN model via knowledge distillation, and the structure knowledge is transferred among clients during the global update. Our extensive experiments demonstrate the superiority of the proposed FedGLS over five baselines.
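The abstract does not spell out how the feature encoder "retains structure knowledge via knowledge distillation," but a standard formulation is to treat the structure-aware GNN as the teacher and the features-only encoder as the student, minimizing the KL divergence between their temperature-softened class distributions. The sketch below illustrates that generic distillation loss in NumPy; the function names (`kd_loss`, `softmax`) and the temperature value are illustrative assumptions, not FedGLS's actual objective.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(gnn_logits, encoder_logits, T=2.0):
    # KL(teacher || student): the feature encoder (student, features only)
    # is pushed to match the GNN's (teacher, uses graph structure)
    # softened class distribution, so structure knowledge is distilled
    # into a model that needs no adjacency information.
    p = softmax(gnn_logits, T)      # teacher distribution
    q = softmax(encoder_logits, T)  # student distribution
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float(np.mean(kl) * T * T)  # T^2 rescaling, as in standard KD

# Toy example: logits for 4 nodes and 3 classes.
rng = np.random.default_rng(0)
gnn_out = rng.normal(size=(4, 3))
enc_out = rng.normal(size=(4, 3))
loss = kd_loss(gnn_out, enc_out)
```

On a graphless client, a loss of this shape would let the locally trained encoder carry structure-derived signal received from other clients, since the KL term is zero exactly when the student reproduces the teacher's distribution.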