Federated Learning (FL) allows devices to train a global machine learning model without sharing their local data. In wireless networks, the inherently unreliable transmission channel introduces delays and errors that compromise the regularity of global model updates. Furthermore, the limited resources and energy consumption of devices also affect FL performance. This work therefore proposes a new FL algorithm, FL-E2WS, that jointly considers the requirements of federated training and of the wireless network in an Internet of Things setting. To reduce the energy cost of devices, FL-E2WS schedules communication resources, allocating suitable bandwidth and transmit power for model transmission under a given device selection and uplink resource block allocation, while meeting delay, power consumption, and packet error rate requirements. Simulation results demonstrate that FL-E2WS reduces energy consumption by up to 70.12% and improves the accuracy of the global model by up to 10.21% compared to FL algorithms that lack knowledge of the transmission channel. Additionally, compared to FL variants that scale communication resources, FL-E2WS achieves up to a 38.61% reduction in energy consumption and improves global model accuracy by up to 1.61%.