Recent advancements in large artificial intelligence models (LAMs) are driving significant innovations in mobile edge computing within next-generation wireless networks. However, the substantial computational resources and large-scale training data required to train LAMs conflict with the limited storage and computational capacity of edge devices, posing significant challenges to training and deploying LAMs at the edge. In this work, we introduce the Networked Mixture-of-Experts (NMoE) system, in which clients perform inference collaboratively by distributing tasks to suitable neighbors based on their expertise and aggregating the returned results. For training the NMoE, we propose a federated learning framework that integrates both supervised and self-supervised learning to balance personalization and generalization, while preserving communication efficiency and data privacy. We conduct extensive experiments to demonstrate the efficacy of the proposed NMoE system, providing insights into the design of NMoE training algorithms.