We derive a novel generative model from iterative Gaussian posterior inference. By treating the generated sample as an unknown variable, we can formulate the sampling process in the language of Bayesian probability. Our model uses a sequence of prediction and posterior update steps to iteratively narrow down the unknown sample starting from a broad initial belief. In addition to a rigorous theoretical analysis, we establish a connection between our model and diffusion models and show that it includes Bayesian Flow Networks (BFNs) as a special case. In our experiments, we demonstrate that our model improves sample quality on ImageNet32 over both BFNs and the closely related Variational Diffusion Models, while achieving equivalent log-likelihoods on ImageNet32 and ImageNet64. Find our code at https://github.com/martenlienen/bsi.
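The iterative narrowing described above can be illustrated with a minimal sketch of conjugate Gaussian posterior updates: starting from a broad initial belief, each noisy observation of the unknown sample tightens the posterior. All names and the noise schedule here are hypothetical illustrations, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# The "unknown" sample we want to narrow down to.
x = rng.normal(size=4)

# Broad initial belief: mean 0, very low precision (high variance).
mu, prec = np.zeros(4), 1e-3

for step in range(50):
    noise_prec = 1.0  # precision of each noisy observation (illustrative)
    y = x + rng.normal(size=4) / np.sqrt(noise_prec)  # noisy view of x
    # Conjugate Gaussian update: precisions add,
    # means combine weighted by precision.
    new_prec = prec + noise_prec
    mu = (prec * mu + noise_prec * y) / new_prec
    prec = new_prec

# After many updates, the posterior mean approaches x
# and the posterior variance 1/prec shrinks toward zero.
print(np.abs(mu - x).max(), 1 / prec)
```

In the actual model, the role of the noisy observations is played by learned prediction steps, but the posterior-update mechanics follow this conjugate Gaussian pattern.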