This paper studies the optimization of the KL functional on the Wasserstein space of probability measures and develops a sampling framework based on Wasserstein gradient descent (WGD). We identify two important subclasses of the Wasserstein space on which the WGD scheme is guaranteed to converge, thereby providing new theoretical foundations for optimization-based sampling methods on measure spaces. For practical implementation, we construct a particle-based WGD algorithm in which the score function is estimated via score matching. Through a series of numerical experiments, we demonstrate that WGD provides good approximations of a variety of complex target distributions, including those that pose substantial challenges for standard MCMC and parametric variational Bayes methods. These results suggest that WGD offers a promising and flexible alternative for scalable Bayesian inference in high-dimensional or multimodal settings.
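The particle-based algorithm described above can be sketched in a few lines: particles follow the Wasserstein gradient of the KL functional, x_i ← x_i + η(∇log π(x_i) − ∇log ρ(x_i)), where ∇log ρ must be estimated from the particles themselves. The sketch below is illustrative only: it uses a standard Gaussian target and substitutes a simple kernel density score estimator for the score-matching estimator used in the paper; the function names, step size, and bandwidth are all hypothetical choices.

```python
import numpy as np

def grad_log_target(x):
    # Score of an illustrative standard Gaussian target pi(x) ∝ exp(-||x||^2 / 2).
    return -x

def kde_score(particles, bandwidth=0.5):
    # Kernel density estimate of ∇ log rho at each particle.
    # (A stand-in for the score-matching estimator described in the abstract.)
    diffs = particles[:, None, :] - particles[None, :, :]   # (n, n, d): x_i - x_j
    sq = np.sum(diffs**2, axis=-1)                          # (n, n) squared distances
    w = np.exp(-sq / (2 * bandwidth**2))                    # Gaussian kernel weights
    # ∇ log rho(x_i) ≈ sum_j w_ij (x_j - x_i) / (h^2 sum_j w_ij)
    num = np.einsum('ij,ijk->ik', w, -diffs)
    return num / (bandwidth**2 * w.sum(axis=1, keepdims=True))

def wgd(particles, n_steps=500, step=0.05):
    # Wasserstein gradient descent on KL(rho || pi):
    # each particle moves along ∇ log pi - ∇ log rho.
    for _ in range(n_steps):
        particles = particles + step * (grad_log_target(particles)
                                        - kde_score(particles))
    return particles

rng = np.random.default_rng(0)
x0 = rng.normal(loc=3.0, scale=0.3, size=(200, 2))  # initialize far from the target
x = wgd(x0)
print(np.round(x.mean(axis=0), 1))  # particle mean drifts toward the target mean 0
```

The attraction term ∇log π pulls particles toward high-probability regions of the target, while the −∇log ρ term acts as a repulsion that prevents collapse, which is why the particle cloud spreads out to approximate π rather than concentrating at its mode.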