Maximum marginal likelihood estimation (MMLE) can be formulated as the optimization of a free energy functional. From this viewpoint, the Expectation-Maximisation (EM) algorithm admits a natural interpretation as a coordinate descent method over the joint space of model parameters and probability measures. Recently, a significant body of work has adopted this perspective, leading to interacting particle algorithms for MMLE. In this paper, we propose an accelerated version of one such procedure, based on Stein variational gradient descent (SVGD), by introducing Nesterov acceleration in both the parameter updates and in the space of probability measures. The resulting method, termed Momentum SVGD-EM, consistently accelerates convergence in terms of required iterations across various tasks of increasing difficulty, demonstrating effectiveness in both low- and high-dimensional settings.
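To make the idea concrete, here is a minimal illustrative sketch (not the paper's exact algorithm) of an SVGD particle update with Nesterov-style momentum, using a toy 1-D Gaussian target, an RBF kernel with the median-heuristic bandwidth, and hyperparameters chosen purely for illustration:

```python
import numpy as np

def rbf_kernel(x):
    """RBF kernel matrix and its gradient, median-heuristic bandwidth."""
    d2 = (x[:, None] - x[None, :]) ** 2          # pairwise squared distances
    h = np.median(d2) / np.log(len(x) + 1) + 1e-8
    k = np.exp(-d2 / h)
    gk = -2.0 * (x[:, None] - x[None, :]) / h * k  # grad of k(x_j, x_i) in x_j
    return k, gk

def svgd_momentum(x, grad_logp, steps=1000, eps=0.05, mu=0.9):
    """SVGD with a Nesterov-style look-ahead on the particle positions.

    Hypothetical hyperparameters (eps, mu, steps) for illustration only.
    """
    v = np.zeros_like(x)
    for _ in range(steps):
        y = x + mu * v                     # Nesterov look-ahead point
        k, gk = rbf_kernel(y)
        # SVGD direction: kernel-weighted score plus repulsive term
        phi = (k @ grad_logp(y) + gk.sum(axis=0)) / len(y)
        v = mu * v + eps * phi             # momentum update
        x = x + v
    return x

# toy target: N(2, 1), so grad log p(x) = -(x - 2)
rng = np.random.default_rng(0)
particles = rng.normal(0.0, 0.5, size=50)
particles = svgd_momentum(particles, lambda x: -(x - 2.0))
```

After running, the particle cloud should approximate the target's mean and spread; the paper's full method additionally applies momentum to the model-parameter (M-step) updates, which this sketch omits.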