We propose an uplink over-the-air aggregation (OAA) method for wireless federated learning (FL) that trains multiple models simultaneously. To maximize the multi-model training convergence rate, we derive an upper bound on the optimality gap of the global model update and then formulate an uplink joint transmit-receive beamforming optimization problem to minimize this upper bound. We solve this problem via block coordinate descent, which admits low-complexity closed-form updates. Simulation results show that our proposed multi-model FL with fast OAA substantially outperforms the conventional single-model approach, which trains the models sequentially.