We present Surjective Sequential Neural Likelihood (SSNL) estimation, a novel method for simulation-based inference in models for which the likelihood function is intractable and only a simulator that can generate synthetic data is available. SSNL fits a dimensionality-reducing surjective normalizing flow model and uses it as a surrogate likelihood function, which allows for conventional Bayesian inference using either Markov chain Monte Carlo methods or variational inference. By embedding the data in a low-dimensional space, SSNL resolves several issues that previous likelihood-based methods faced when applied to high-dimensional data sets that, for instance, contain non-informative data dimensions or lie along a lower-dimensional manifold. We evaluate SSNL on a wide variety of experiments and show that it generally outperforms contemporary methods used in simulation-based inference, for instance, on a challenging real-world example from astrophysics that models the magnetic field strength of the sun using a solar dynamo model.