We tackle a major challenge in federated learning (FL) -- achieving good performance under highly heterogeneous client distributions. The difficulty arises in part from two seemingly contradictory goals: learning a common model by aggregating information from clients, and learning personalized local models adapted to each local distribution. In this work, we propose Solution Simplex Clustered Federated Learning (SosicFL) to dissolve this contradiction. Building on recent ideas of learning solution simplices, SosicFL assigns a subregion of a simplex to each client and performs FL to learn a common solution simplex. This allows client models to express their individual characteristics within the degrees of freedom of the solution simplex, while still achieving the goal of learning a global common model. Our experiments show that SosicFL improves performance and accelerates training for both global and personalized FL with minimal computational overhead.
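To make the core idea concrete, here is a minimal sketch (hypothetical, not the authors' implementation): a solution simplex is spanned by a few vertex parameter vectors, a concrete model is a convex combination of those vertices, and each client draws barycentric coordinates from its assigned subregion -- sketched here as a Dirichlet distribution concentrated around a client-specific center.

```python
# Hypothetical sketch of a "solution simplex" over model parameters.
# A model theta(alpha) = sum_j alpha_j * theta_j, with alpha on the
# probability simplex; each client samples alpha from its own subregion.
import numpy as np

rng = np.random.default_rng(0)

k, d = 3, 5                          # number of simplex vertices, parameter dim
vertices = rng.normal(size=(k, d))   # theta_1..theta_k (flattened parameters)

def sample_alpha(center, concentration=50.0):
    """Barycentric coords near `center`, via a concentrated Dirichlet."""
    return rng.dirichlet(concentration * np.asarray(center))

def client_model(center):
    """A client's model: a convex combination of the simplex vertices."""
    alpha = sample_alpha(center)     # point inside the client's subregion
    return alpha @ vertices          # weighted mix of vertex parameters

center = np.array([0.7, 0.2, 0.1])   # hypothetical subregion center for one client
theta = client_model(center)
assert theta.shape == (d,)
```

In such a scheme, the vertices are what FL aggregates globally, while the per-client coordinates `alpha` carry the personalization; the names `sample_alpha`, `client_model`, and the Dirichlet parameterization are illustrative assumptions.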