In this work, we propose a first-order sampling method called the Metropolis-adjusted Preconditioned Langevin Algorithm for approximate sampling from a target distribution whose support is a proper convex subset of $\mathbb{R}^{d}$. Our proposed method is the result of applying a Metropolis-Hastings filter to the Markov chain formed by a single step of the preconditioned Langevin algorithm with a metric $\mathscr{G}$, and is motivated by the natural gradient descent algorithm for optimisation. We derive non-asymptotic upper bounds on the mixing time of this method for sampling from target distributions whose potentials are bounded relative to $\mathscr{G}$, and for exponential distributions restricted to the support. Our analysis suggests that if $\mathscr{G}$ satisfies the stronger notions of self-concordance introduced in Kook and Vempala (2024), then these mixing time upper bounds have a strictly better dependence on the dimension than when $\mathscr{G}$ is merely self-concordant. We also provide numerical experiments that demonstrate the practicality of our proposed method. Our method is a high-accuracy sampler due to the polylogarithmic dependence on the error tolerance in our mixing time upper bounds.
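As a rough illustration of the kind of step described above, the following is a minimal sketch of a Metropolis-adjusted preconditioned Langevin update for an exponential target $\pi(x) \propto e^{-c^{\top}x}$ restricted to the positive orthant, taking $\mathscr{G}$ to be the Hessian $\mathrm{diag}(1/x_i^2)$ of the log-barrier $-\sum_i \log x_i$. The specific target, metric, step size `h`, and names (`c`, `mapla_step`) are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 3
c = np.ones(d)   # potential f(x) = c·x on (0, ∞)^d, so grad f = c (assumed example)
h = 0.05         # step size (illustrative choice)

def f(x):
    return c @ x

def log_q(y, x):
    """Log-density of the proposal y ~ N(x - h G(x)^{-1} grad f, 2h G(x)^{-1}).

    With the barrier metric G(x) = diag(1/x_i^2), G(x)^{-1} is diag(x_i^2).
    """
    g_inv = x ** 2
    mu = x - h * g_inv * c              # preconditioned Langevin drift
    var = 2.0 * h * g_inv               # diagonal proposal variances
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (y - mu) ** 2 / var)

def mapla_step(x):
    """One step: preconditioned Langevin proposal + Metropolis-Hastings filter."""
    # G(x)^{-1/2} = diag(x_i), so the noise is scaled coordinatewise by x
    y = x - h * (x ** 2) * c + np.sqrt(2 * h) * x * rng.standard_normal(d)
    if np.any(y <= 0):                  # proposal left the support: reject
        return x
    log_alpha = (f(x) - f(y)) + (log_q(x, y) - log_q(y, x))
    return y if np.log(rng.uniform()) < log_alpha else x

x = np.ones(d)
for _ in range(5000):
    x = mapla_step(x)
print(x)  # iterates remain in the positive orthant by construction
```

Note that proposals falling outside the support are rejected outright, since the target density vanishes there; this is what confines the chain to the convex body without any projection step.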