Unsupervised learning has been widely used in many real-world applications. One of the simplest and most important unsupervised learning models is the Gaussian mixture model (GMM). In this work, we study the multi-task learning problem for GMMs, which aims to leverage potentially similar GMM parameter structures across tasks to obtain improved learning performance compared to single-task learning. We propose a multi-task GMM learning procedure based on the EM algorithm that effectively utilizes unknown similarities between related tasks and is robust against a fraction of outlier tasks drawn from arbitrary distributions. The proposed procedure is shown to achieve the minimax optimal rate of convergence for both the parameter estimation error and the excess mis-clustering error, in a wide range of regimes. Moreover, we generalize our approach to tackle the problem of transfer learning for GMMs, where similar theoretical results are derived. Additionally, iterative unsupervised multi-task and transfer learning methods may suffer from an initialization alignment problem, and we propose two alignment algorithms to resolve this issue. Finally, we demonstrate the effectiveness of our methods through simulations and real data examples. To the best of our knowledge, this is the first work to study multi-task and transfer learning on GMMs with theoretical guarantees.