Recent research has focused on designing neural samplers that amortize the process of sampling from unnormalized densities. However, despite significant advancements, these samplers still fall short of the state-of-the-art MCMC approach, Parallel Tempering (PT), in terms of target-evaluation efficiency. On the other hand, unlike a well-trained neural sampler, PT yields only dependent samples and must be rerun -- at considerable computational cost -- whenever new samples are required. To address these weaknesses, we propose the Progressive Tempering Sampler with Diffusion (PTSD), which trains diffusion models sequentially across temperatures, leveraging the advantages of PT to improve the training of neural samplers. We also introduce a novel method for combining high-temperature diffusion models to generate approximate lower-temperature samples, which are minimally refined with MCMC and then used to train the next diffusion model. PTSD enables efficient reuse of sample information across temperature levels while generating well-mixed, uncorrelated samples. Our method significantly improves target-evaluation efficiency and outperforms diffusion-based neural samplers.
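To make the pipeline above concrete, the following is a minimal numpy sketch of the progressive-tempering loop, under assumptions that are not taken from the paper: a 1D double-well target, an evenly spaced inverse-temperature ladder, and a simple Gaussian fit standing in for each trained diffusion model. The functions log_target, metropolis, fit_sampler, and combine are illustrative placeholders, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Unnormalized log-density of a toy 1D double-well target."""
    return -(x ** 2 - 4.0) ** 2 / 2.0

def metropolis(x, beta, n_steps=20, step=0.5):
    """A few random-walk Metropolis steps targeting exp(beta * log_target)."""
    for _ in range(n_steps):
        prop = x + step * rng.normal(size=x.shape)
        accept = np.log(rng.uniform(size=x.shape)) < beta * (log_target(prop) - log_target(x))
        x = np.where(accept, prop, x)
    return x

def fit_sampler(samples):
    """Stand-in for training a diffusion model at one temperature: fit a Gaussian."""
    return samples.mean(), samples.std()

def combine(hot, hotter):
    """Combine two fitted levels via log p_new = 2*log p_hot - log p_hotter (+ const)."""
    (m1, s1), (m2, s2) = hot, hotter
    prec = 2.0 / s1 ** 2 - 1.0 / s2 ** 2           # assumed positive for this toy
    mean = (2.0 * m1 / s1 ** 2 - m2 / s2 ** 2) / prec
    return mean, 1.0 / np.sqrt(prec)

# Inverse-temperature ladder, hottest first, evenly spaced so that
# beta_k = 2*beta_{k-1} - beta_{k-2}, hence pi^{beta_k} is proportional to
# (pi^{beta_{k-1}})^2 / pi^{beta_{k-2}}.
betas = np.linspace(0.05, 1.0, 6)
n = 2000

# Bootstrap the two hottest levels with cheap MCMC (they mix easily).
samplers = {}
for b in betas[:2]:
    samplers[b] = fit_sampler(metropolis(rng.normal(size=n), b, n_steps=200))

# Progressively descend the ladder: combine the two previous levels to propose
# approximate samples at the next colder level, refine minimally with MCMC at
# the true tempered target, then "train" the next sampler on those samples.
for k in range(2, len(betas)):
    mean, std = combine(samplers[betas[k - 1]], samplers[betas[k - 2]])
    proposals = mean + std * rng.normal(size=n)
    refined = metropolis(proposals, betas[k], n_steps=20)
    samplers[betas[k]] = fit_sampler(refined)

print("sampler fitted at the target temperature:", samplers[betas[-1]])
```

The combination step exploits the evenly spaced ladder: because the tempered densities satisfy pi^{beta_k} proportional to (pi^{beta_{k-1}})^2 / pi^{beta_{k-2}}, two hotter-level samplers suffice to propose at the next colder level, and only a short MCMC refinement is needed before training the next model.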