Federated Learning (FL) has emerged as a solution for distributed model training across decentralized, privacy-preserving devices, but the heterogeneous energy capacities of participating devices (system heterogeneity) constrain real-world implementations. These energy limitations not only reduce model accuracy but also increase dropout rates, impairing convergence in practical FL deployments. In this work, we propose LeanFed, an energy-aware FL framework designed to optimize client selection and training workloads on battery-constrained devices. LeanFed leverages adaptive data usage by dynamically adjusting the fraction of local data each device uses during training, thereby maximizing device participation across communication rounds while ensuring that devices do not exhaust their batteries mid-training. We rigorously evaluate LeanFed against traditional FedAvg on the CIFAR-10 and CIFAR-100 datasets, simulating varying levels of data heterogeneity and device participation rates. Results show that LeanFed consistently improves model accuracy and stability, particularly in settings with high data heterogeneity and limited battery life, by mitigating client dropout and extending device availability. These findings demonstrate the potential of energy-efficient, privacy-preserving FL in real-world, large-scale applications, laying a foundation for robust and sustainable pervasive AI on resource-constrained networks.
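The adaptive data-usage idea above can be sketched minimally as follows. This is a hypothetical illustration, not the paper's actual algorithm: the function name `data_fraction`, the even per-round energy budgeting, and all parameters are assumptions introduced here. The sketch shows how a client could cap the fraction of its local dataset used in a round so that its estimated training energy never exceeds its remaining battery budget.

```python
# Hypothetical sketch (names and budgeting rule assumed, not from the paper):
# a client chooses the fraction of local data to train on this round so that
# it can survive the remaining communication rounds on its current battery.

def data_fraction(battery_j, rounds_left, energy_per_sample_j, n_samples, epochs=1):
    """Return the fraction of local data a client can afford this round.

    battery_j            -- remaining battery energy (joules)
    rounds_left          -- communication rounds the client should survive
    energy_per_sample_j  -- estimated energy cost of training on one sample
    n_samples            -- size of the client's local dataset
    epochs               -- local epochs per round
    """
    # Spread the remaining battery evenly over the remaining rounds.
    budget_j = battery_j / max(rounds_left, 1)
    # Number of samples affordable under that per-round budget.
    affordable = budget_j / (energy_per_sample_j * epochs)
    # Clamp to [0, 1] as a fraction of the local dataset.
    return max(0.0, min(1.0, affordable / n_samples))

# Example: 50 J left, 10 rounds to go, 0.01 J/sample, 1000 local samples
# -> per-round budget 5 J -> 500 affordable samples -> fraction 0.5.
frac = data_fraction(50.0, 10, 0.01, 1000)
```

A real system would replace the even split with the paper's client-selection and workload-assignment logic, and would estimate per-sample energy empirically on each device.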