We study the task of efficiently sampling from a Gibbs distribution $d\pi^* = e^{-h}\, d\mathrm{vol}_g$ over a Riemannian manifold $M$ via (geometric) Langevin MCMC; this algorithm involves computing exponential maps in random Gaussian directions and is efficiently implementable in practice. The key to our analysis of Langevin MCMC is a bound on the discretization error of the geometric Euler–Maruyama scheme, assuming $\nabla h$ is Lipschitz and $M$ has bounded sectional curvature. Our error bound matches the error of the Euclidean Euler–Maruyama scheme in terms of its stepsize dependence. Combined with a contraction guarantee for the geometric Langevin diffusion under Kendall–Cranston coupling, we prove that the Langevin MCMC iterates are within Wasserstein distance $\epsilon$ of $\pi^*$ after $\tilde{O}(\epsilon^{-2})$ steps, which matches the iteration complexity of Euclidean Langevin MCMC. Our results apply in general settings where $h$ can be nonconvex and $M$ can have negative Ricci curvature. Under the additional assumptions that the Riemannian curvature tensor has bounded derivatives and that $\pi^*$ satisfies a $CD(\cdot,\infty)$ condition, we analyze the stochastic gradient version of Langevin MCMC and bound its iteration complexity by $\tilde{O}(\epsilon^{-2})$ as well.
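To make the update rule concrete, here is a minimal sketch of geometric Langevin MCMC on the unit sphere $S^2$, where the exponential map and tangent-space projection have closed forms. The target $h(x) = \beta \langle x, \mu \rangle$ and all parameter values below are illustrative choices, not taken from the paper; each step computes the exponential map at the current iterate in the direction of the negative Riemannian gradient plus a scaled tangent-space Gaussian, i.e. $x_{k+1} = \mathrm{Exp}_{x_k}(-\eta\, \mathrm{grad}\, h(x_k) + \sqrt{2\eta}\, \xi_k)$.

```python
import numpy as np

def sphere_exp(x, v):
    # Exponential map on the unit sphere: follow the geodesic from x
    # with initial (tangent) velocity v for unit time.
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def tangent_project(x, u):
    # Project an ambient vector u onto the tangent space at x.
    return u - np.dot(u, x) * x

def langevin_step(x, grad_h, eta, rng):
    # One geometric Euler-Maruyama step:
    #   x_{k+1} = Exp_{x_k}( -eta * grad h(x_k) + sqrt(2*eta) * xi_k )
    # where xi_k is a standard Gaussian in the tangent space at x_k.
    g = tangent_project(x, grad_h(x))                       # Riemannian gradient
    xi = tangent_project(x, rng.standard_normal(x.shape))   # tangent Gaussian
    return sphere_exp(x, -eta * g + np.sqrt(2 * eta) * xi)

# Sample from pi* ∝ exp(-h) on S^2 with the illustrative choice
# h(x) = beta * <x, mu>, whose mode is at -mu.
rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0, 1.0])
beta = 5.0
grad_h = lambda x: beta * mu  # ambient gradient; projected inside the step

x = np.array([1.0, 0.0, 0.0])
samples = []
for k in range(20000):
    x = langevin_step(x, grad_h, eta=1e-2, rng=rng)
    if k >= 10000:  # discard burn-in
        samples.append(x)

mean = np.mean(samples, axis=0)  # should point roughly toward -mu
```

Because each iterate is produced by the exponential map, the chain stays exactly on the manifold; this is the sense in which the scheme is "efficiently implementable" whenever exponential maps can be computed.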