This paper introduces and analyses interacting underdamped Langevin algorithms, termed Kinetic Interacting Particle Langevin Monte Carlo (KIPLMC) methods, for statistical inference in latent variable models. We propose a diffusion process that evolves jointly in the space of parameters and latent variables and exploit the fact that the stationary distribution of this diffusion concentrates around the maximum marginal likelihood estimate of the parameters. We then provide two explicit discretisations of this diffusion as practical algorithms to estimate parameters of statistical models. For each algorithm, we obtain nonasymptotic rates of convergence for the case where the joint log-likelihood is strongly concave with respect to latent variables and parameters. In particular, we analyse the convergence of the diffusion together with the discretisation error, yielding convergence rate estimates for the algorithms in Wasserstein-2 distance. To demonstrate the utility of the introduced methodology, we provide numerical experiments that illustrate the effectiveness of the proposed diffusion for statistical inference and the stability of the numerical integrators used for discretisation. Our setting covers a broad range of applications, including unsupervised learning, statistical inference, and inverse problems.
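The interacting underdamped dynamics described above can be sketched on a toy Gaussian latent variable model. This is a minimal illustration under assumed model and step-size choices, not the paper's algorithm or experiments: each of N particles carries a latent vector with its own velocity, the parameter evolves under the gradient averaged over particles with noise scaled by 1/sqrt(N), and both updates follow an Euler-type kinetic Langevin step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (an illustrative assumption, not from the paper):
#   z_i ~ N(theta, 1),  y_i | z_i ~ N(z_i, 1)
# Marginally y_i ~ N(theta, 2), so the maximum marginal
# likelihood estimate is theta* = mean(y).
D = 50                                   # number of observations / latents
theta_true = 2.0
y = theta_true + np.sqrt(2.0) * rng.standard_normal(D)

def grad_z(theta, Z):
    # Gradient of the joint negative log-likelihood in the latents,
    # applied to every particle at once; Z has shape (N, D).
    return (Z - theta) + (Z - y)

def grad_theta(theta, Z):
    # Parameter gradient averaged over the N interacting particles.
    return np.mean(np.sum(theta - Z, axis=1))

# Kinetic (underdamped) state: positions and velocities for both
# the latent particles and the parameter.
N, h, gamma, steps = 100, 0.01, 1.0, 5000
Z = rng.standard_normal((N, D))
Vz = np.zeros((N, D))
theta, Vt = 0.0, 0.0

for _ in range(steps):
    # Velocity refresh: friction, drift, and injected noise;
    # the parameter noise is scaled by 1/sqrt(N) so the stationary
    # theta-marginal concentrates as N grows.
    Vz += -h * (gamma * Vz + grad_z(theta, Z)) \
          + np.sqrt(2.0 * gamma * h) * rng.standard_normal((N, D))
    Vt += -h * (gamma * Vt + grad_theta(theta, Z)) \
          + np.sqrt(2.0 * gamma * h / N) * rng.standard_normal()
    # Position update uses the refreshed velocities.
    Z += h * Vz
    theta += h * Vt

print(theta, y.mean())  # theta settles near the MMLE, mean(y)
```

In this strongly concave toy problem the parameter trajectory damps toward mean(y), while the particle cloud samples the latent posterior; the velocity variables are what distinguish this kinetic (underdamped) scheme from an overdamped interacting Langevin update.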