The conjunction of edge intelligence and the ever-growing Internet-of-Things (IoT) network heralds a new era of collaborative machine learning, with federated learning (FL) emerging as the most prominent paradigm. With the growing interest in these learning schemes, researchers have begun to address some of their most fundamental limitations. Indeed, conventional FL with a central aggregator presents a single point of failure and a network bottleneck. To bypass this issue, decentralized FL, where nodes collaborate in a peer-to-peer network, has been proposed. Despite its advantages, communication costs and data heterogeneity remain key challenges in decentralized FL. In this context, we propose a novel scheme, called opportunistic communication-efficient decentralized federated learning (OCD-FL), which consists of a systematic selection of FL peers for collaboration, aiming to maximize FL knowledge gain while reducing energy consumption. Experimental results demonstrate the capability of OCD-FL to achieve similar or better performance than fully collaborative FL, while significantly reducing consumed energy by at least 30% and up to 80%.