Approximate Bayesian Computation (ABC) methods are commonly used to approximate posterior distributions in models with unknown or computationally intractable likelihoods. Classical ABC methods are based on nearest-neighbor-type algorithms and rely on the choice of so-called summary statistics, distances between datasets, and a tolerance threshold. Recently, methods combining ABC with more complex machine learning algorithms have been proposed to mitigate the impact of these ``user choices''. In this paper, we propose the first ABC method, to our knowledge, that is completely free of summary statistics, distances, and tolerance thresholds. Moreover, in contrast with the usual generalizations of ABC, it associates a confidence interval, with proper frequentist marginal coverage, with the posterior mean estimate (or other moment-type estimates). Our method, named ABCD-Conformal, uses a neural network with Monte Carlo Dropout to estimate the posterior mean (or other moment-type functionals), and conformal theory to obtain the associated confidence sets. The resulting method is amortized and efficient for estimating multidimensional parameters. We test it on four different applications and compare it with other ABC methods from the literature.
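The combination of Monte Carlo Dropout point estimates with split-conformal calibration can be illustrated with a minimal sketch. This is not the authors' implementation: the data below are synthetic stand-ins for the network's outputs (a hypothetical posterior-mean estimate `mu` and dropout-based spread `sigma` per calibration dataset), and the interval construction shown is a standard normalized split-conformal recipe assumed to be representative of the approach.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.1  # target miscoverage: aim for 90% marginal coverage

# Synthetic stand-ins for the network outputs on a calibration set:
# mu    = MC-Dropout posterior-mean estimate for each dataset,
# sigma = MC-Dropout spread (predictive uncertainty),
# theta = true parameter that generated each calibration dataset.
n_cal = 500
theta = rng.normal(size=n_cal)
sigma = 0.5 + rng.random(n_cal)              # heteroscedastic uncertainty
mu = theta + sigma * rng.normal(size=n_cal)  # noisy point estimates

# Normalized nonconformity scores on the calibration set.
scores = np.abs(theta - mu) / sigma

# Conformal quantile with the finite-sample (n+1) correction.
k = int(np.ceil((n_cal + 1) * (1 - alpha)))
q = np.sort(scores)[k - 1]

# For a new dataset, the set mu_new +/- q * sigma_new is a conformal
# interval with marginal coverage >= 1 - alpha under exchangeability.
n_test = 2000
theta_t = rng.normal(size=n_test)
sigma_t = 0.5 + rng.random(n_test)
mu_t = theta_t + sigma_t * rng.normal(size=n_test)
covered = np.abs(theta_t - mu_t) <= q * sigma_t
print(f"empirical coverage: {covered.mean():.3f}")
```

Because the calibration step only uses the scores, the coverage guarantee is marginal and distribution-free: it holds regardless of how well the network's `sigma` reflects the true uncertainty, which is what makes the combination attractive on top of a dropout network.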