The Gibbs sampler (a.k.a. Glauber dynamics and heat-bath algorithm) is a popular Markov Chain Monte Carlo algorithm which iteratively samples from the conditional distributions of a probability measure $\pi$ of interest. Under the assumption that $\pi$ is strongly log-concave, we show that the random scan Gibbs sampler contracts in relative entropy and provide a sharp characterization of the associated contraction rate. Assuming that evaluating conditionals is cheap compared to evaluating the joint density, our results imply that the number of full evaluations of $\pi$ needed for the Gibbs sampler to mix grows linearly with the condition number and is independent of the dimension. If $\pi$ is non-strongly log-concave, the convergence rate in entropy degrades from exponential to polynomial. Our techniques are versatile and extend to Metropolis-within-Gibbs schemes and the Hit-and-Run algorithm. A comparison with gradient-based schemes and the connection with the optimization literature are also discussed.
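To make the setting concrete, here is a minimal sketch (not taken from the paper) of a random scan Gibbs sampler for a strongly log-concave target: a bivariate Gaussian with unit variances and correlation `rho`, whose coordinate conditionals are available in closed form. The function name and parameters are illustrative only.

```python
import numpy as np

def random_scan_gibbs(rho, n_steps, seed=0):
    """Random scan Gibbs sampler for N(0, [[1, rho], [rho, 1]]).

    At each step a coordinate is chosen uniformly at random and
    resampled from its exact conditional given the other coordinate.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(2)
    samples = np.empty((n_steps, 2))
    cond_std = np.sqrt(1.0 - rho**2)  # std of x_i given x_j
    for t in range(n_steps):
        i = rng.integers(2)           # random scan: pick a coordinate
        j = 1 - i
        # Exact conditional: x_i | x_j ~ N(rho * x_j, 1 - rho^2)
        x[i] = rho * x[j] + cond_std * rng.standard_normal()
        samples[t] = x
    return samples

samples = random_scan_gibbs(rho=0.9, n_steps=50_000)
```

Note that each update touches only one conditional, never the joint density; this is the regime in which the abstract's cost comparison (conditional evaluations vs. full evaluations of $\pi$) applies. For `rho = 0.9` the covariance has eigenvalues $1 \pm \rho$, giving condition number $19$, so mixing is fast but visibly slower than in the uncorrelated case.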