In this paper, we depart from the widely used gradient descent-based hierarchical federated learning (FL) algorithms to develop a novel hierarchical FL framework based on the alternating direction method of multipliers (ADMM). Within this framework, we propose two novel FL algorithms, both of which use ADMM in the top layer: one employs ADMM in the lower layer as well, while the other uses the conventional gradient descent-based approach there. The proposed framework enhances privacy, and experiments demonstrate that the proposed algorithms outperform conventional algorithms in terms of learning convergence and accuracy. Additionally, gradient descent in the lower layer performs well even when the number of local steps is very limited, whereas using ADMM in both layers leads to better performance otherwise.
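To make the two-layer structure concrete, the following is a minimal sketch of the variant with ADMM in the top layer and gradient descent in the lower layer. It is an illustration under stated assumptions, not the paper's exact algorithm: the least-squares client objectives, the synthetic data, and the hyperparameters `rho`, `lr`, and `local_steps` are all hypothetical. Each edge server lets its clients take a few gradient steps on an ADMM-augmented local objective and averages the results, and the cloud then couples the edge models through a consensus-ADMM update with dual ascent.

```python
# Hedged sketch: hierarchical FL with consensus ADMM at the top (cloud-edge)
# layer and plain gradient descent at the lower (edge-client) layer.
# All data and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d, n_edges, clients_per_edge = 5, 3, 4
rho, lr, local_steps = 1.0, 0.01, 10

# Hypothetical per-client data (A, b) for the loss f(x) = 0.5 * ||A x - b||^2.
data = [[(rng.standard_normal((20, d)), rng.standard_normal(20))
         for _ in range(clients_per_edge)] for _ in range(n_edges)]

z = np.zeros(d)                             # global (cloud) model
w = [np.zeros(d) for _ in range(n_edges)]   # edge models
y = [np.zeros(d) for _ in range(n_edges)]   # ADMM dual variables

for _ in range(100):
    for e in range(n_edges):
        # Lower layer: each client starts from its edge model and takes a few
        # GD steps on its loss plus the ADMM coupling term
        # y_e^T (x - z) + (rho / 2) * ||x - z||^2; the edge then averages.
        updates = []
        for A, b in data[e]:
            x = w[e].copy()
            for _ in range(local_steps):
                grad = A.T @ (A @ x - b) + y[e] + rho * (x - z)
                x -= lr * grad
            updates.append(x)
        w[e] = np.mean(updates, axis=0)
    # Top layer: cloud consensus update, then the dual ascent step that
    # pulls the edge models and the global model into agreement.
    z = np.mean([w[e] + y[e] / rho for e in range(n_edges)], axis=0)
    for e in range(n_edges):
        y[e] += rho * (w[e] - z)

print("edge-to-cloud consensus residual:",
      max(np.linalg.norm(w[e] - z) for e in range(n_edges)))
```

The all-ADMM variant would replace the inner gradient loop with an exact (or near-exact) minimization of the same augmented local objective; with few local steps the GD variant already approximates this well, which is consistent with the behavior described above.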