The expressiveness of flow-based models combined with stochastic variational inference (SVI) has expanded the application of optimization-based Bayesian inference to highly complex problems. However, despite the importance of multi-model Bayesian inference for problems defined on a transdimensional joint model-and-parameter space, such as Bayesian structure learning and model selection, flow-based SVI has been limited to problems defined on a fixed-dimensional parameter space. We introduce CoSMIC normalizing flows (COntextually-Specified Masking for Identity-mapped Components), an extension to neural autoregressive conditional normalizing flow architectures that enables a single flow-based variational density to perform inference over a transdimensional (multi-model) conditional target distribution. We propose a stochastic variational transdimensional inference (VTI) approach to training CoSMIC flows that combines ideas from Bayesian optimization and Monte Carlo gradient estimation. Numerical experiments demonstrate the performance of VTI on challenging problems that scale to high-cardinality model spaces.
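The core masking idea can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes an affine coupling-style parameterization, and all names (`cosmic_affine_layer`, `context_mask`) are illustrative. A binary mask derived from the model index multiplies the log-scale and shift, so inactive parameter components pass through as an exact identity map and contribute zero to the log-determinant, letting one flow serve every model in the transdimensional family.

```python
import numpy as np

def cosmic_affine_layer(x, context_mask, log_scale, shift):
    """Illustrative contextually-masked affine flow layer (not the paper's code).

    Active components (mask == 1) receive the usual affine transform;
    inactive components (mask == 0) are forced to the identity map,
    since exp(0) = 1 and a zero shift leave x unchanged.
    """
    s = context_mask * log_scale      # masked dims get log-scale 0
    t = context_mask * shift          # masked dims get shift 0
    z = x * np.exp(s) + t
    log_det = s.sum(axis=-1)          # masked dims contribute 0
    return z, log_det

# Two hypothetical models: model A uses all 3 parameters,
# model B uses only the first 2 (third component inactive).
x = np.array([0.5, -1.0, 2.0])
log_scale = np.array([0.1, 0.2, 0.3])
shift = np.array([1.0, -1.0, 0.5])

mask_b = np.array([1.0, 1.0, 0.0])    # context for model B
z, log_det = cosmic_affine_layer(x, mask_b, log_scale, shift)
# z[2] equals x[2] exactly: the inactive component is identity-mapped.
```

In an actual conditional autoregressive architecture the log-scale and shift would be neural-network outputs conditioned on preceding dimensions and on the model context, but the masking acts in the same way.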