This paper proposes FedNMap, a normal map-based method for composite federated learning, where the objective consists of a smooth loss and a possibly nonsmooth regularizer. FedNMap leverages a normal map-based update scheme to handle the nonsmooth term and incorporates a local correction strategy to mitigate the impact of data heterogeneity across clients. Under standard assumptions, including smooth local losses, weak convexity of the regularizer, and bounded stochastic gradient variance, FedNMap achieves linear speedup with respect to both the number of clients $n$ and the number of local updates $Q$ for nonconvex losses, with or without the Polyak-Łojasiewicz (PL) condition. To our knowledge, this is the first result establishing linear speedup for nonconvex composite federated learning.
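To make the normal map update concrete, the sketch below applies a generic (centralized, single-client) normal map iteration to a composite problem $\min_x f(x) + \psi(x)$. The choice $\psi(x) = \mu \|x\|_1$, the toy quadratic loss, and all parameter values are illustrative assumptions, not taken from the paper; FedNMap's actual federated algorithm with local correction is not reproduced here.

```python
import numpy as np

# Normal map for min_x f(x) + psi(x):
#   F_nor(z) = grad_f(prox_{lam*psi}(z)) + (z - prox_{lam*psi}(z)) / lam
# The update is performed on an auxiliary variable z; the iterate of
# interest is x = prox_{lam*psi}(z). Here psi(x) = mu * ||x||_1 is a
# hypothetical choice (the paper only requires psi to be weakly convex).

def prox_l1(z, lam, mu):
    """Proximal operator of lam * mu * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam * mu, 0.0)

def normal_map_step(z, grad_f, lam, mu, step):
    """One normal map step: z <- z - step * F_nor(z)."""
    x = prox_l1(z, lam, mu)
    f_nor = grad_f(x) + (z - x) / lam
    return z - step * f_nor

# Toy problem: f(x) = 0.5 * ||x - b||^2 with an L1 regularizer.
b = np.array([1.0, -2.0, 0.05])
grad_f = lambda x: x - b

z = np.zeros(3)
for _ in range(200):
    z = normal_map_step(z, grad_f, lam=1.0, mu=0.1, step=0.5)
x = prox_l1(z, lam=1.0, mu=0.1)  # small entries of b shrink to exactly 0
```

A fixed point $z^\star$ of this iteration satisfies $\nabla f(x^\star) + (z^\star - x^\star)/\lambda = 0$ with $x^\star = \mathrm{prox}_{\lambda\psi}(z^\star)$, which is the first-order stationarity condition of the composite problem; for this toy instance the iterate recovers the soft-thresholded solution.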