Stein variational gradient descent (SVGD) is a prominent particle-based variational inference method for sampling from a target distribution. SVGD has attracted interest for applications in machine-learning techniques such as Bayesian inference. In this paper, we propose novel trainable algorithms that incorporate a deep-learning technique called deep unfolding into SVGD. This approach facilitates the learning of the internal parameters of SVGD, thereby accelerating its convergence. To evaluate the proposed trainable SVGD algorithms, we conducted numerical simulations of three tasks: sampling from a one-dimensional Gaussian mixture, performing Bayesian logistic regression, and learning Bayesian neural networks. The results show that our proposed algorithms exhibit faster convergence than the conventional variants of SVGD.
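To make the baseline concrete, the following is a minimal sketch of a plain (non-trainable) SVGD update applied to the abstract's first task, sampling a one-dimensional Gaussian mixture. The mixture parameters, kernel bandwidth `h`, and step size `step` are illustrative assumptions, not values from the paper; in the proposed trainable variants, such internal parameters would be learned via deep unfolding rather than fixed by hand.

```python
import numpy as np

def score_gmm(x, mus=(-2.0, 2.0), sigma=1.0):
    # Gradient of log p(x) for an equal-weight 1D Gaussian mixture
    # (an assumed illustrative target, not the paper's exact setup).
    mus = np.array(mus)
    comps = np.exp(-(x[:, None] - mus) ** 2 / (2 * sigma**2))
    w = comps / comps.sum(axis=1, keepdims=True)   # posterior component weights
    return (w * (mus - x[:, None]) / sigma**2).sum(axis=1)

def svgd_step(x, step=0.1, h=0.5):
    # One SVGD update: each particle moves along a kernel-weighted
    # average of the scores plus a repulsive kernel-gradient term.
    diff = x[:, None] - x[None, :]            # x_i - x_j
    k = np.exp(-diff**2 / (2 * h**2))         # RBF kernel k(x_j, x_i)
    grad_k = diff / h**2 * k                  # d k(x_j, x_i) / d x_j (repulsion)
    phi = (k @ score_gmm(x) + grad_k.sum(axis=1)) / len(x)
    return x + step * phi

# Run a particle ensemble toward the bimodal target.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=50)
for _ in range(500):
    x = svgd_step(x)
```

After the loop, the particles spread over both mixture modes rather than collapsing to a single point, which is the behavior the repulsive term provides; the trainable algorithms in the paper aim to reach such a configuration in fewer iterations.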