Federated learning (FL), an emerging collaborative learning paradigm, has garnered significant attention for its capacity to preserve privacy in distributed learning systems. In these systems, clients collaboratively train a unified neural network model on their local datasets and share model parameters rather than raw data, thereby enhancing privacy. FL systems are predominantly designed for mobile and edge computing environments, where training typically occurs over wireless networks. Consequently, as model sizes grow, conventional FL frameworks consume increasingly substantial communication resources. To address this challenge and improve communication efficiency, this paper introduces a novel hierarchical FL framework that combines the benefits of clustered FL and model compression. We present an adaptive clustering algorithm that selects a core client for each cluster and dynamically organizes clients into clusters. Furthermore, to enhance transmission efficiency, each core client runs a local aggregation with compression (LC aggregation) algorithm after collecting compressed models from the other clients in its cluster. Simulation results confirm that the proposed algorithms maintain comparable predictive accuracy while significantly reducing energy consumption relative to existing FL mechanisms.
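The abstract does not specify the compression scheme or the aggregation rule used by the core clients. A minimal sketch of how such an LC-aggregation step might look, assuming top-k sparsification as the compression method and FedAvg-style dataset-size weighting (both are common choices, not details taken from the paper; all function names here are hypothetical):

```python
import numpy as np

def topk_compress(weights, ratio=0.1):
    """Keep only the largest-magnitude fraction of entries (top-k sparsification).
    Returns a sparse (indices, values, shape) triple instead of the dense array."""
    flat = weights.ravel()
    k = max(1, int(ratio * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx], weights.shape

def topk_decompress(idx, values, shape):
    """Rebuild a dense array from the sparse (indices, values) representation,
    zero-filling the entries that were dropped during compression."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = values
    return flat.reshape(shape)

def lc_aggregate(compressed_updates, sample_counts):
    """Core-client step: decompress the models received from cluster members
    and average them, weighting each by its local dataset size."""
    total = sum(sample_counts)
    agg = None
    for (idx, values, shape), n in zip(compressed_updates, sample_counts):
        dense = topk_decompress(idx, values, shape) * (n / total)
        agg = dense if agg is None else agg + dense
    return agg
```

The core client would then re-compress the aggregated model before forwarding it upstream, so that only sparse updates ever traverse the wireless links.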