High-energy physics requires the generation of large numbers of simulated data samples from complex but analytically tractable distributions called matrix elements. Surrogate models, such as normalizing flows, are gaining popularity for this task due to their computational efficiency. We adopt an approach based on Flow Annealed Importance Sampling Bootstrap (FAB) that evaluates the differentiable target density during training, avoiding the costly pre-generation of training data. We show that FAB reaches higher sampling efficiency with fewer target evaluations in high dimensions compared to other methods.
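The sampling-efficiency metric referred to above can be illustrated with a minimal importance-sampling sketch. The example below is not the FAB algorithm itself; it only shows how efficiency is typically measured (normalized effective sample size of the importance weights) when a proposal distribution, here a standard normal standing in for a trained flow, is used to sample from a differentiable target density. The `target_density` function is a hypothetical stand-in for a matrix element.

```python
import numpy as np

# Hypothetical unnormalized 1-D target density, standing in for a
# (differentiable) matrix element. Here: a Gaussian centered at 1.
def target_density(x):
    return np.exp(-0.5 * (x - 1.0) ** 2)

# Proposal: standard normal, standing in for a trained normalizing flow
# whose density can be evaluated exactly.
rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal(n)
proposal_density = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

# Importance weights w_i = p(x_i) / q(x_i). The sampling efficiency is
# the normalized effective sample size (Kish ESS divided by n); it is 1
# for a perfect proposal and drops toward 0 as the proposal mismatches
# the target.
w = target_density(x) / proposal_density
efficiency = w.sum() ** 2 / (n * (w**2).sum())
print(f"sampling efficiency: {efficiency:.3f}")
```

For this particular target/proposal pair the efficiency is roughly 1/e, reflecting the unit shift in the mean; a better-trained proposal would push it toward 1, which is the quantity the abstract compares across methods.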