Hamiltonian Monte Carlo (HMC) is a state-of-the-art method for sampling from distributions with differentiable densities, but it can converge slowly when applied to challenging multimodal problems. Running HMC with a time-varying Hamiltonian, in order to interpolate from an initial tractable distribution to the target of interest, can address this problem. In conjunction with a weighting scheme to eliminate bias, this can be viewed as a special case of Sequential Monte Carlo (SMC) sampling \cite{doucet2001introduction}. However, this approach can be inefficient, since it requires a slow transition between the initial and final distributions. Inspired by \cite{sels2017minimizing}, where a learned \emph{counterdiabatic} term added to the Hamiltonian allows for efficient quantum state preparation, we propose \emph{Counterdiabatic Hamiltonian Monte Carlo} (CHMC), which can be viewed as an SMC sampler with a more efficient kernel. We establish its relationship to recent proposals for accelerating gradient-based sampling with learned drift terms, and demonstrate its performance on simple benchmark problems.
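For concreteness, one standard realization of such a time-varying target (a common choice in SMC samplers, given here as an illustrative assumption rather than the specific construction of this work) is the geometric path between a tractable base density $\pi_0$ and the target $\pi_1$,
\[
\pi_t(x) \propto \pi_0(x)^{1-\beta_t}\,\pi_1(x)^{\beta_t}, \qquad 0 = \beta_0 < \beta_1 < \cdots < \beta_T = 1,
\]
with the bias-eliminating importance weights updated multiplicatively as $w_t = w_{t-1}\,\pi_t(x_{t-1})/\pi_{t-1}(x_{t-1})$ after each change of target.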