Multiple local steps are key to communication-efficient federated learning. However, for general non-smooth convex problems, theoretical guarantees for such algorithms without assumptions bounding data heterogeneity have been lacking. Leveraging projection-efficient optimization methods, we propose FedMLS, a federated learning algorithm with provable improvements from multiple local steps. FedMLS attains an $\epsilon$-suboptimal solution in $\mathcal{O}(1/\epsilon)$ communication rounds, requiring a total of $\mathcal{O}(1/\epsilon^2)$ stochastic subgradient oracle calls.
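To make the "multiple local steps per communication round" pattern concrete, here is a minimal, generic sketch of local-update federated optimization on a toy quadratic objective. This is an illustration of the general pattern only, not the FedMLS algorithm itself; all function names, step counts, and constants are assumptions for the example.

```python
# Generic sketch of federated learning with multiple local steps.
# NOT the FedMLS algorithm; a toy illustration of the pattern only.
import random

def local_update_round(w, client_data, local_steps=5, lr=0.1):
    """One communication round: each client runs several stochastic
    subgradient steps locally, then the server averages the results."""
    updated = []
    for data in client_data:
        w_local = w
        for _ in range(local_steps):
            x = random.choice(data)        # stochastic sample
            grad = 2.0 * (w_local - x)     # subgradient of (w - x)^2
            w_local -= lr * grad
        updated.append(w_local)
    return sum(updated) / len(updated)     # server-side averaging

def train(client_data, rounds=50):
    """Run several communication rounds from w = 0."""
    w = 0.0
    for _ in range(rounds):
        w = local_update_round(w, client_data)
    return w
```

With two clients holding data centered at 1 and 3, the averaged model converges toward the global minimizer 2, despite each client taking multiple steps toward its own local minimizer between rounds.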