Approximate Bayesian Computation (ABC) methods are commonly used to approximate posterior distributions in models with unknown or computationally intractable likelihoods. Classical ABC methods are based on nearest-neighbor-type algorithms and rely on the choice of so-called summary statistics, of a distance between datasets, and of a tolerance threshold. Recently, methods combining ABC with more complex machine learning algorithms have been proposed to mitigate the impact of these ``user choices''. In this paper, we propose the first ABC method, to our knowledge, that is completely free of summary statistics, distances, and tolerance thresholds. Moreover, in contrast with the usual generalizations of ABC, it associates a confidence interval, with proper frequentist marginal coverage, with the posterior mean estimate (or other moment-type estimates). Our method, named ABCD-Conformal, uses a neural network with Monte Carlo Dropout to estimate the posterior mean (or other moment-type functionals), and conformal prediction theory to obtain the associated confidence sets. The method is efficient for estimating multidimensional parameters and is amortized; we test it on four different applications and compare it with other ABC methods from the literature.
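The two ingredients named in the abstract can be sketched together: a stochastic predictor standing in for the Monte Carlo Dropout network yields a mean estimate and a spread from repeated forward passes, and split conformal prediction calibrates normalized residual scores on held-out simulations to form intervals with marginal coverage. This is a minimal NumPy sketch, not the paper's implementation: the simulator, the `mc_dropout_predict` stand-in (which emulates B dropout passes with hand-made noise), and all constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the ABC setting: draw a parameter theta from its prior
# and simulate data y | theta (here a one-dimensional summary of each dataset).
def simulate(n):
    theta = rng.uniform(0.0, 1.0, size=n)
    y = theta + 0.05 * rng.normal(size=n)
    return y, theta

# Stand-in for a trained network with Monte Carlo Dropout: B stochastic
# forward passes give a mean estimate and a per-observation spread.
def mc_dropout_predict(y, B=100):
    draws = y[None, :] + 0.05 * rng.normal(size=(B, y.size))
    return draws.mean(axis=0), draws.std(axis=0) + 1e-8

# Split conformal calibration with normalized scores |theta - mean| / spread.
alpha = 0.1                       # target miscoverage level
y_cal, theta_cal = simulate(500)  # calibration simulations
mu_cal, sd_cal = mc_dropout_predict(y_cal)
scores = np.abs(theta_cal - mu_cal) / sd_cal
k = int(np.ceil((1 - alpha) * (len(scores) + 1)))  # conformal quantile rank
q = np.sort(scores)[k - 1]

# For new datasets: point estimate plus conformal confidence interval.
y_new, theta_new = simulate(200)
mu, sd = mc_dropout_predict(y_new)
lower, upper = mu - q * sd, mu + q * sd
coverage = np.mean((theta_new >= lower) & (theta_new <= upper))
```

On the new simulations, `coverage` should land near the nominal 90% level, illustrating the frequentist marginal coverage guarantee; the intervals also adapt in width through the per-observation spread `sd`.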