This work addresses a key limitation of current federated learning approaches, which predominantly focus on homogeneous tasks and neglect the task diversity present on local devices. We propose a principled integration of multi-task learning via multi-output Gaussian processes (MOGP) at the local level with federated learning at the global level. MOGP handles correlated classification and regression tasks, offering a Bayesian non-parametric approach that naturally quantifies uncertainty. The central server aggregates the posteriors from local devices into an updated global MOGP prior, which is redistributed to the devices for further local training until convergence. The challenge of posterior inference on local devices is addressed through the P\'{o}lya-Gamma augmentation technique and mean-field variational inference, improving both computational efficiency and convergence rate. Experimental results on synthetic and real data demonstrate superior predictive performance, out-of-distribution (OOD) detection, uncertainty calibration, and convergence rate, highlighting the method's potential in diverse applications. Our code is publicly available at https://github.com/JunliangLv/task_diversity_BFL.