Bayesian inference for doubly intractable distributions is challenging because their likelihoods involve intractable normalizing terms that are themselves functions of the parameters of interest. Although several algorithms have been developed for such models, they are computationally intensive due to repeated auxiliary variable simulations. We propose a novel Monte Carlo Stein variational gradient descent (MC-SVGD) approach to inference for doubly intractable distributions. Through an efficient gradient approximation, our MC-SVGD approach rapidly transforms an arbitrary reference distribution into an approximation of the posterior distribution of interest, without requiring any predefined variational family for the posterior. Such a transport map is obtained by minimizing the Kullback--Leibler divergence between the transformed and posterior distributions in a reproducing kernel Hilbert space (RKHS). We also investigate the convergence rate of the proposed method. We illustrate the method on challenging examples, including a Potts model, an exponential random graph model, and a Conway--Maxwell--Poisson regression model. The proposed method achieves substantial computational gains over existing algorithms, while providing comparable inferential performance for the posterior distributions.
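As a concrete illustration of the transport step the abstract describes (not the authors' MC-SVGD implementation, which additionally approximates the intractable likelihood gradient via Monte Carlo auxiliary draws), the standard SVGD particle update of Liu and Wang can be sketched as follows: each particle moves along the RKHS-optimal direction combining a kernel-weighted score term and a repulsive term. The toy target here is a tractable Gaussian, chosen only so the score is available in closed form.

```python
import numpy as np

def rbf_kernel(X, h):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradients w.r.t. x_j."""
    diff = X[:, None, :] - X[None, :, :]           # (n, n, d)
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))
    gradK = -diff / h ** 2 * K[:, :, None]          # grad_{x_j} k(x_j, x_i)
    return K, gradK

def svgd_step(X, grad_logp, h=1.0, eps=0.1):
    """One SVGD update: phi(x_i) = (1/n) sum_j [k(x_j,x_i) grad log p(x_j)
    + grad_{x_j} k(x_j,x_i)]; the second term repels particles apart."""
    n = X.shape[0]
    K, gradK = rbf_kernel(X, h)
    phi = (K @ grad_logp(X) + gradK.sum(axis=0)) / n
    return X + eps * phi

# Toy usage: transport particles from N(5, 1) toward a standard normal target.
rng = np.random.default_rng(0)
X = rng.normal(5.0, 1.0, size=(50, 1))
grad_logp = lambda x: -x                            # score of N(0, 1)
for _ in range(500):
    X = svgd_step(X, grad_logp)
```

In the doubly intractable setting, `grad_logp` is not available in closed form; MC-SVGD replaces it with a Monte Carlo estimate built from simulated auxiliary variables, which is the source of the computational savings the abstract reports.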