Generative diffusion models have recently emerged as a powerful strategy for performing stochastic sampling in Bayesian inverse problems, delivering remarkably accurate solutions across a wide range of challenging applications. However, diffusion models often require a large number of neural function evaluations per sample in order to deliver accurate posterior samples. As a result, using diffusion models as stochastic samplers for Monte Carlo integration in Bayesian computation can be highly computationally expensive, particularly in applications that require a substantial number of Monte Carlo samples to conduct uncertainty quantification analyses. This cost is especially high in large-scale inverse problems such as computational imaging, which rely on large neural networks that are expensive to evaluate. With quantitative imaging applications in mind, this paper presents a Multilevel Monte Carlo strategy that significantly reduces the cost of Bayesian computation with diffusion models. This is achieved by exploiting cost-accuracy trade-offs inherent to diffusion models in order to carefully couple models of different levels of accuracy, in a manner that significantly reduces the overall cost of the calculation without reducing the final accuracy. The proposed approach achieves a $4\times$-to-$8\times$ reduction in computational cost relative to standard techniques across three benchmark imaging problems.
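The coupling idea sketched in the abstract can be illustrated with a generic Multilevel Monte Carlo estimator. The sketch below is a textbook MLMC example on a toy stochastic differential equation, not the paper's diffusion-model sampler: the fine and coarse discretizations at each level are driven by the same Brownian increments, so their difference has low variance and most samples can be drawn at the cheap coarse levels. All function names here (`coupled_samples`, `mlmc_estimate`) are hypothetical.

```python
import numpy as np

def coupled_samples(rng, level, n, base_steps=2):
    """Draw n coupled (fine, coarse) samples for one MLMC level.

    Toy model: Euler-Maruyama for dX = X dW with X_0 = 1 (a driftless
    geometric Brownian motion), quantity of interest f(X) = X_1.
    The fine path uses base_steps * 2**level steps; the coarse path uses
    half as many, driven by the SAME Brownian increments (each coarse
    increment is the sum of two fine ones), which keeps Var[fine - coarse]
    small -- the key mechanism behind MLMC cost savings.
    """
    n_fine = base_steps * 2**level
    dt = 1.0 / n_fine
    dW = rng.normal(0.0, np.sqrt(dt), size=(n, n_fine))

    # Fine path.
    x_fine = np.ones(n)
    for k in range(n_fine):
        x_fine = x_fine + x_fine * dW[:, k]

    if level == 0:
        # Base level: no coarse correction term.
        return x_fine, np.zeros(n)

    # Coarse path, reusing the fine increments pairwise.
    dW_coarse = dW[:, 0::2] + dW[:, 1::2]
    x_coarse = np.ones(n)
    for k in range(dW_coarse.shape[1]):
        x_coarse = x_coarse + x_coarse * dW_coarse[:, k]
    return x_fine, x_coarse

def mlmc_estimate(rng, max_level=4, n_per_level=20000):
    """Telescoping MLMC estimator:
    E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}],
    where each expectation is estimated with independent coupled samples.
    """
    total = 0.0
    for level in range(max_level + 1):
        fine, coarse = coupled_samples(rng, level, n_per_level)
        total += np.mean(fine - coarse)
    return total
```

In this toy problem the exact mean is $\mathbb{E}[X_1] = 1$, so the estimator should land close to 1; in practice one would also choose the per-level sample sizes $N_\ell$ from the estimated per-level variances and costs, which is where the $4\times$-to-$8\times$ savings reported in the abstract come from in the diffusion-model setting.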