Federated learning (FL) enables decentralized model training across clients without sharing raw data, but its performance degrades under real-world data heterogeneity. Existing methods often fail to address distribution shift across clients and distribution drift over time, or they rely on unrealistic assumptions, such as a known number of client clusters or known data heterogeneity types, which limits their generalizability. We introduce Feroma, a novel FL framework that explicitly handles both distribution shift and drift without relying on client or cluster identity. Feroma builds on client distribution profiles (compact, privacy-preserving representations of local data) that guide model aggregation and test-time model assignment through adaptive similarity-based weighting. This design allows Feroma to dynamically select aggregation strategies during training, ranging from clustered to personalized, and to deploy suitable models to unseen, unlabeled test clients without retraining, online adaptation, or prior knowledge of clients' data. Extensive experiments against 10 state-of-the-art methods show that Feroma improves performance and stability under dynamic data heterogeneity, with average accuracy gains of up to 12 percentage points over the best baselines across 6 benchmarks, while keeping computational and communication overhead comparable to FedAvg. These results highlight that distribution-profile-based aggregation offers a practical path toward robust FL under both data distribution shift and drift.
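The central mechanism, aggregating client models with weights derived from the similarity of their distribution profiles, can be illustrated with a minimal sketch. The specific choices below (a normalized label histogram as the profile, cosine similarity, and a softmax temperature `tau`) are illustrative assumptions for exposition, not Feroma's actual design:

```python
import numpy as np

def profile(labels, num_classes):
    """Compact distribution profile: here, a normalized label histogram
    (a hypothetical choice of representation)."""
    h = np.bincount(labels, minlength=num_classes).astype(float)
    return h / h.sum()

def cosine(a, b):
    """Cosine similarity between two profile vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def aggregate_for(target_profile, client_profiles, client_params, tau=0.1):
    """Similarity-based aggregation (assumed form): softmax over cosine
    similarities to the target profile yields the mixing weights, then the
    client model parameters are averaged with those weights."""
    sims = np.array([cosine(target_profile, p) for p in client_profiles])
    w = np.exp(sims / tau)
    w /= w.sum()
    return sum(wi * params for wi, params in zip(w, client_params))

# Toy usage: two clients with opposite label skew; aggregating for a target
# whose profile matches client 0 concentrates weight on client 0's model.
p0 = profile(np.array([0, 0, 0, 1]), num_classes=2)  # skewed toward class 0
p1 = profile(np.array([1, 1, 1, 0]), num_classes=2)  # skewed toward class 1
params0, params1 = np.array([1.0]), np.array([0.0])  # stand-in model weights
merged = aggregate_for(p0, [p0, p1], [params0, params1])
```

A low `tau` makes the weighting nearly clustered (winner-take-most), while a high `tau` approaches a uniform FedAvg-style average, which is one plausible way a single mechanism could span the clustered-to-personalized range the abstract describes.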