We study the problem of sampling from posterior distributions with intractable normalizing constants in Bayesian inference. We propose a generative modeling approach based on optimal transport (OT) that learns a deterministic map from a reference distribution to the target posterior via constrained optimization. Structural constraints from OT theory guarantee uniqueness of the map and allow efficient generation of many independent, high-quality posterior samples. The framework accommodates both continuous and mixed discrete-continuous parameter spaces, with specific adaptations for latent variable models and near-Gaussian posteriors. Beyond its computational benefits, the approach enables new inferential tools based on OT-derived multivariate ranks and quantiles for Bayesian exploratory analysis and visualization. We demonstrate its effectiveness through multiple simulation studies and a real-data analysis.