Classical Bayesian inference with complex models is often infeasible because the likelihood is intractable. Simulation-based inference methods, such as Approximate Bayesian Computation (ABC), estimate posteriors without evaluating a likelihood function by exploiting the fact that data can be simulated quickly from the model, but they converge slowly and scale poorly to high-dimensional settings. In this paper, we propose a framework for Bayesian posterior estimation that maps data to posteriors of parameters using a neural network trained on data simulated from the complex model. Posterior distributions of model parameters are then obtained efficiently by feeding observed data into the trained network. We show theoretically that our posteriors converge to the true posteriors in Kullback-Leibler divergence. Our approach provides computationally efficient and theoretically justified uncertainty quantification, which is lacking in existing simulation-based neural network approaches. Comprehensive simulation studies highlight the robustness and accuracy of our method.
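To make the simulate-train-amortize pipeline described above concrete, here is a minimal sketch in PyTorch. It assumes a toy conjugate Gaussian simulator and a Gaussian family for the posterior approximation; the paper's actual simulator, network architecture, and loss are not specified in the abstract, so the `simulate` function, the layer sizes, and the Gaussian output head are illustrative assumptions only.

```python
import torch
import torch.nn as nn

# Hypothetical toy simulator: theta ~ N(0, 1) prior, x | theta ~ N(theta, 0.5^2).
def simulate(n):
    theta = torch.randn(n, 1)            # draw parameters from the prior
    x = theta + 0.5 * torch.randn(n, 1)  # simulate data from the model
    return theta, x

# Network maps data x to the parameters (mean, log-variance) of a
# Gaussian approximation to the posterior of theta given x.
net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    theta, x = simulate(256)
    mu, log_var = net(x).chunk(2, dim=1)
    # Negative Gaussian log density of theta under the predicted posterior
    # (up to an additive constant).
    loss = 0.5 * (log_var + (theta - mu) ** 2 / log_var.exp()).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Amortized inference: feed observed data through the trained network.
x_obs = torch.tensor([[0.8]])
mu, log_var = net(x_obs).chunk(2, dim=1)
print(f"posterior approx: N({mu.item():.3f}, {log_var.exp().item():.3f})")
```

Minimizing this expected negative log density over simulated (theta, x) pairs pushes the network's output toward the true posterior within the chosen family, which is the intuition behind the Kullback-Leibler convergence guarantee stated above; in this toy conjugate example the true posterior is itself Gaussian, so the family is exact.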