We address the challenge of federated learning on graph-structured data distributed across multiple clients. Specifically, we focus on the prevalent scenario of interconnected subgraphs, where the connections between different clients play a critical role. We present a novel framework for this scenario, named FedStruct, that harnesses deep structural dependencies. To uphold privacy, unlike existing methods, FedStruct eliminates the need to share or generate sensitive node features or embeddings among clients. Instead, it leverages explicit global graph structure information to capture inter-node dependencies. We validate the effectiveness of FedStruct through experiments on six datasets for semi-supervised node classification, showcasing performance close to that of the centralized approach across various scenarios, including different data partitioning methods, varying levels of label availability, and different numbers of clients.