In the absence of explicit or tractable likelihoods, Bayesians often resort to approximate Bayesian computation (ABC) for inference. Our work bridges ABC with deep neural implicit samplers based on generative adversarial networks (GANs) and adversarial variational Bayes. Both ABC and GANs compare aspects of observed and fake data to simulate from posteriors and likelihoods, respectively. We develop a Bayesian GAN (B-GAN) sampler that directly targets the posterior by solving an adversarial optimization problem. B-GAN is driven by a deterministic mapping learned on the ABC reference table by conditional GANs. Once the mapping has been trained, iid posterior samples are obtained by filtering noise through it at a negligible additional cost. We propose two post-processing local refinements using (1) data-driven proposals with importance reweighting, and (2) variational Bayes. We support our findings with frequentist-Bayesian results, showing that the typical total variation distance between the true and approximate posteriors converges to zero for certain neural network generators and discriminators. Experiments on simulated data show highly competitive performance relative to some of the most recent likelihood-free posterior simulators.
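The workflow described above can be sketched in a few lines. This is a minimal illustration, not the paper's method: the toy Gaussian conjugate model and the closed-form map `G` below are assumptions chosen so the sketch stays runnable; a real B-GAN would instead learn the conditional map `G(z, x)` adversarially on the ABC reference table.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: build an ABC reference table of (theta_i, x_i) pairs by drawing
# parameters from the prior and data from the simulator.
# Toy model (illustrative only): theta ~ N(0, 1), x | theta ~ N(theta, 1).
n_ref = 10_000
theta_ref = rng.normal(0.0, 1.0, n_ref)
x_ref = rng.normal(theta_ref, 1.0)

# Step 2: B-GAN trains a conditional generator G(z, x) on this table via an
# adversarial objective. Here we substitute the *known* optimal map for the
# conjugate model, since its posterior is N(x/2, 1/2) in closed form.
def G(z, x_obs):
    """Deterministic map pushing noise z to posterior draws given x_obs."""
    return x_obs / 2.0 + np.sqrt(0.5) * z

# Step 3: once the map is trained, iid posterior samples cost only a pass
# of fresh noise through G -- no further simulator calls are needed.
x_obs = 1.8
z = rng.normal(size=50_000)
posterior_draws = G(z, x_obs)

print(posterior_draws.mean())  # close to the true posterior mean x_obs / 2
print(posterior_draws.var())   # close to the true posterior variance 1 / 2
```

The key point the sketch captures is the amortization in step 3: all simulation cost is paid up front when building the reference table and training the map, after which posterior sampling reduces to cheap noise filtering.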