A long-standing gap exists between the theoretical analysis of Markov chain Monte Carlo convergence, which is often based on statistical divergences, and the diagnostics used in practice. We introduce the first general convergence diagnostics for Markov chain Monte Carlo based on any $f$-divergence, allowing users to directly monitor, among others, the Kullback--Leibler and $\chi^2$ divergences as well as the Hellinger and total variation distances. Our first key contribution is a coupling-based `weight harmonization' scheme that produces a direct, computable, and consistent weighting of interacting Markov chains with respect to their target distribution. The second key contribution is to show how such consistent weightings of empirical measures can be used to obtain upper bounds on general $f$-divergences. We prove that these bounds tighten over time and converge to zero as the chains approach stationarity, providing a concrete diagnostic. Numerical experiments demonstrate that our method is a practical and competitive diagnostic tool.
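For reference, the divergences named above are all instances of the standard $f$-divergence; the sketch below recalls that definition under assumed notation ($D_f$, target $\pi$, approximating measure $\mu$), which is not necessarily the notation used in the paper itself.
\[
  D_f(\pi \,\|\, \mu) \;=\; \int f\!\left(\frac{\mathrm{d}\pi}{\mathrm{d}\mu}\right) \mathrm{d}\mu,
  \qquad f \text{ convex},\ f(1) = 0,\ \pi \ll \mu,
\]
with, for example, $f(t) = t \log t$ (Kullback--Leibler), $f(t) = (t-1)^2$ ($\chi^2$), $f(t) = \tfrac{1}{2}\lvert t-1 \rvert$ (total variation), and $f(t) = \tfrac{1}{2}(\sqrt{t}-1)^2$ (squared Hellinger).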