Black-box variational inference (BBVI) with Gaussian mixture families offers a flexible approach for approximating complex posterior distributions without requiring gradients of the target density. However, standard numerical optimization methods often suffer from instability and inefficiency. We develop a stable and efficient framework that combines three key components: (1) affine-invariant preconditioning via natural gradient formulations, (2) an exponential integrator that unconditionally preserves the positive definiteness of covariance matrices, and (3) adaptive time stepping to ensure stability and to accommodate distinct warm-up and convergence phases. The proposed approach has natural connections to manifold optimization and mirror descent. For Gaussian posteriors, we prove exponential convergence in the noise-free setting and almost-sure convergence under Monte Carlo estimation, rigorously justifying the necessity of adaptive time stepping. Numerical experiments on multimodal distributions, Neal's multiscale funnel, and a PDE-based Bayesian inverse problem for Darcy flow demonstrate the effectiveness of the proposed method.
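To illustrate component (2), the sketch below shows one concrete way an exponential integrator can unconditionally preserve positive definiteness of a covariance update: the correction enters through a matrix exponential in a congruence transform, so the result is symmetric positive definite for any step size, unlike a plain Euler step. This is a minimal illustrative sketch, not the paper's exact scheme; the function name and the specific update formula are assumptions.

```python
import numpy as np
from scipy.linalg import expm


def exp_integrator_step(Sigma, G, h):
    """One exponential-integrator step for a covariance matrix.

    Illustrative update (an assumption, not the paper's exact scheme):
        Sigma_new = S expm(h * S^{-1} G S^{-1}) S,   S = Sigma^{1/2},
    which is symmetric positive definite for ANY step size h,
    whereas the Euler step Sigma + h*G can lose positive definiteness.
    """
    # Symmetric square root (and its inverse) via eigendecomposition.
    w, V = np.linalg.eigh(Sigma)
    sqrt_S = V @ np.diag(np.sqrt(w)) @ V.T
    inv_sqrt_S = V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    # Whitened drift; symmetrize to guard against round-off.
    M = inv_sqrt_S @ G @ inv_sqrt_S
    M = 0.5 * (M + M.T)

    # Congruence with a matrix exponential: SPD by construction.
    return sqrt_S @ expm(h * M) @ sqrt_S


# A drift with a strong negative direction and a large step:
Sigma = np.eye(2)
G = np.array([[-3.0, 0.0], [0.0, 1.0]])
h = 1.0
Sigma_new = exp_integrator_step(Sigma, G, h)   # stays SPD
Sigma_euler = Sigma + h * G                    # goes indefinite
```

In this example the Euler step yields eigenvalues (-2, 2) and is no longer a valid covariance, while the exponential step yields eigenvalues (e^{-3}, e), remaining positive definite regardless of h.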