Assume that we would like to estimate the expected value of a function $F$ with respect to a density $\pi$. We prove that if $\pi$ is close enough, in Kullback-Leibler (KL) divergence, to another density $q$, then an independent Metropolis sampler that targets $\pi$ with proposal density $q$, combined with a control-variate variance-reduction strategy, achieves smaller asymptotic variance than the crude Monte Carlo estimator. The control-variate construction requires no extra computational effort but assumes that the expected value of $F$ under $q$ is analytically available. We illustrate this result by calculating the marginal likelihood in a linear regression model with prior-likelihood conflict and a non-conjugate prior. Furthermore, we propose an adaptive independent Metropolis algorithm that adapts the proposal density so as to reduce its KL divergence from the target. We demonstrate its applicability on Bayesian logistic regression and Gaussian process regression problems, and we rigorously justify our asymptotic arguments under easily verifiable and essentially minimal conditions.
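As a minimal illustration of the estimator described above (a sketch, not the paper's exact construction), the following Python snippet runs an independent Metropolis chain targeting a hypothetical Gaussian $\pi$ with a standard-Gaussian proposal $q$, and uses the i.i.d. proposed values, whose expectation under $q$ is known analytically, to build a zero-mean control variate. The specific densities, $F(x) = x$, and the coefficient estimate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed, not from the paper):
# target pi = N(0.3, 1.1^2), proposal q = N(0, 1), F(x) = x.
def log_pi(x): return -0.5 * ((x - 0.3) / 1.1) ** 2
def log_q(x):  return -0.5 * x ** 2
def F(x):      return x
mu_q = 0.0  # E_q[F], assumed analytically available

n = 200_000
x = 0.0
chain = np.empty(n)
props = np.empty(n)  # proposed values, reused to build the control variate
for t in range(n):
    y = rng.normal()  # i.i.d. proposal draw from q
    props[t] = y
    # independent Metropolis acceptance: min(1, pi(y) q(x) / (pi(x) q(y)))
    if np.log(rng.random()) < (log_pi(y) + log_q(x)) - (log_pi(x) + log_q(y)):
        x = y
    chain[t] = x

crude = F(chain).mean()         # plain ergodic-average estimator of E_pi[F]
cv = F(props).mean() - mu_q     # control variate: mean zero under q
# coefficient estimated from the same run (a common practical choice)
c = np.cov(F(chain), F(props))[0, 1] / F(props).var()
cv_est = crude - c * cv         # control-variate-corrected estimator
```

Because the proposals are recycled from the sampler itself, the correction adds no extra density or function evaluations, matching the "no extra computational effort" claim; the only requirement is the closed-form $E_q[F]$.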