Approximate Bayesian Computation (ABC) methods are commonly used to approximate posterior distributions in models whose likelihoods are unknown or computationally intractable. Classical ABC methods are based on nearest-neighbor-type algorithms and rely on the choice of so-called summary statistics, of distances between datasets, and of a tolerance threshold. Recently, methods combining ABC with more complex machine learning algorithms have been proposed to mitigate the impact of these "user choices". In this paper, we propose the first ABC method, to our knowledge, that is completely free of summary statistics, distances, and tolerance thresholds. Moreover, in contrast with the usual generalizations of the ABC method, it associates a confidence interval (with proper frequentist marginal coverage) with the estimation of the posterior mean (or of other moment-type functionals). Our method, ABCD-Conformal, uses a neural network with Monte Carlo Dropout to estimate the posterior mean (or other moment-type functionals), and conformal prediction theory to obtain the associated confidence sets. The method is efficient for estimating multidimensional parameters; we test it on three different applications and compare it with other ABC methods from the literature.
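The conformal step referenced above can be illustrated with a minimal split-conformal sketch. This is not the paper's implementation: the toy linear `predictor` merely stands in for the MC-Dropout network's posterior-mean estimate, and the data-generating process is an assumption for illustration. It shows how a held-out calibration set turns any point estimator into an interval with marginal coverage at least 1 - alpha.

```python
import numpy as np

rng = np.random.default_rng(0)

def predictor(x):
    # Hypothetical point estimator: stands in for the MC-Dropout
    # network's posterior-mean output in ABCD-Conformal.
    return 2.0 * x

# Calibration set (x_i, theta_i), held out from the predictor's training.
n_cal = 1000
x_cal = rng.uniform(0.0, 1.0, n_cal)
theta_cal = 2.0 * x_cal + rng.normal(0.0, 0.1, n_cal)

alpha = 0.1  # target miscoverage level

# Nonconformity scores: absolute residuals on the calibration set.
scores = np.abs(theta_cal - predictor(x_cal))

# Conformal quantile: the ceil((n+1)(1-alpha))-th smallest score.
k = int(np.ceil((n_cal + 1) * (1.0 - alpha)))
q = np.sort(scores)[k - 1]

# Prediction interval for a new input: point estimate +/- q.
x_new = 0.5
lo, hi = predictor(x_new) - q, predictor(x_new) + q
```

In the paper's setting the nonconformity score would instead be built from the dropout network's outputs, but the calibration-quantile logic is the same.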