Stein Variational Gradient Descent (SVGD) is a highly efficient method for sampling from an unnormalized probability distribution. However, the SVGD update relies on gradients of the log-density, which may not always be available. Existing gradient-free versions of SVGD rely on simple Monte Carlo approximations or on gradients from surrogate distributions, both of which have limitations. To improve gradient-free Stein variational inference, we combine SVGD steps with evolution strategy (ES) updates. Our results demonstrate that the resulting algorithm generates high-quality samples from unnormalized target densities without requiring gradient information. Compared to prior gradient-free SVGD methods, we find that integrating the ES update into SVGD significantly improves performance on multiple challenging benchmark problems.
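To make the gradient dependence mentioned above concrete, the following is a minimal sketch of a standard SVGD step with an RBF kernel and the common median-heuristic bandwidth. The function name `svgd_step` and the bandwidth choice are illustrative assumptions; the paper's ES-based gradient-free variant is not reproduced here. Note that the update explicitly calls `grad_log_p`, the score function of the target, which is exactly the quantity a gradient-free method must do without.

```python
import numpy as np

def svgd_step(particles, grad_log_p, step_size=0.1):
    """One standard SVGD update with an RBF kernel (illustrative sketch).

    particles   : (n, d) array of current samples
    grad_log_p  : callable returning the (n, d) score of the target density
    """
    n, _ = particles.shape
    diffs = particles[:, None, :] - particles[None, :, :]   # (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)                  # (n, n)
    # median-heuristic bandwidth, a common default in SVGD implementations
    h = np.median(sq_dists) / np.log(n + 1) + 1e-8
    K = np.exp(-sq_dists / h)                               # kernel matrix
    grads = grad_log_p(particles)                           # requires the score!
    # attractive term: kernel-weighted average of score gradients
    attract = K @ grads
    # repulsive term: sum_j grad_{x_j} k(x_j, x_i), keeps particles spread out
    repulse = (2.0 / h) * (K.sum(axis=1, keepdims=True) * particles
                           - K @ particles)
    return particles + step_size * (attract + repulse) / n
```

For example, with `grad_log_p = lambda x: -x` (the score of a standard normal), repeated applications of `svgd_step` drive a set of particles toward samples from N(0, 1); the attractive term pulls particles toward high-density regions while the repulsive term prevents them from collapsing to the mode.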