Federated Learning (FL) enables distributed training across multiple clients without centralized data sharing, while Graph Neural Networks (GNNs) model relational data through message passing. In federated GNN settings, client graphs often exhibit heterogeneous structural and propagation characteristics. When standard aggregation mechanisms are applied to such heterogeneous updates, the global model may converge numerically while exhibiting degraded relational behavior. Our work identifies a geometric failure mode of global aggregation in cross-domain federated GNNs. Although GNN parameters are numerically represented as vectors, they encode relational transformations that govern the direction, strength, and sensitivity of information flow across graph neighborhoods. Aggregating updates that originate from incompatible propagation regimes can therefore introduce destructive interference in this transformation space, leading to a loss of coherence in global message passing. Importantly, this degradation is not necessarily reflected in conventional metrics such as loss or accuracy. To address this issue, we propose GGRS (Global Geometric Reference Structure), a server-side framework that regulates client updates prior to aggregation based on geometric admissibility criteria. GGRS preserves the directional consistency of relational transformations, maintains the diversity of admissible propagation subspaces, and stabilizes sensitivity to neighborhood interactions, all without accessing client data or graph topology. Experiments on heterogeneous GNN-native and Amazon Co-purchase datasets demonstrate that GGRS preserves global message-passing coherence across training rounds, highlighting the necessity of geometry-aware regulation in federated graph learning.
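To make the idea of server-side geometric admissibility concrete, the following is a minimal illustrative sketch, not the paper's actual GGRS algorithm: client updates are treated as vectors, a crude global reference direction is formed from their mean, and only updates whose direction is consistent with that reference (cosine similarity above a threshold) are admitted into the aggregate. All names (`ggrs_aggregate`, `cos_threshold`) and the filtering rule itself are assumptions introduced for this example.

```python
import numpy as np

def ggrs_aggregate(client_updates, cos_threshold=0.0):
    """Hypothetical sketch of geometry-aware aggregation.

    Admits only client updates whose direction is consistent with a
    global reference direction, then averages the admitted updates.
    Illustrative only; the real GGRS criteria are not specified here.
    """
    updates = np.stack(client_updates)        # shape: (n_clients, dim)
    reference = updates.mean(axis=0)          # crude global reference direction
    ref_norm = np.linalg.norm(reference)
    admitted = []
    for u in updates:
        # Cosine similarity between this update and the reference
        cos = float(u @ reference) / (np.linalg.norm(u) * ref_norm + 1e-12)
        if cos >= cos_threshold:              # geometric admissibility check
            admitted.append(u)
    if not admitted:                          # fall back to plain averaging
        return updates.mean(axis=0)
    return np.mean(admitted, axis=0)
```

For example, with two roughly aligned updates and one pointing in the opposite direction, the opposing update is excluded and the aggregate stays directionally coherent, whereas plain FedAvg would let the conflicting update partially cancel the others.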