Covariate-dependent uncertainty quantification in simulation-based inference is crucial for high-stakes decision-making but remains challenging due to the limitations of existing methods such as conformal prediction and the classical bootstrap, which struggle with covariate-specific conditioning. We propose Efficient Quantile-Regression-Based Generative Metamodeling (E-QRGMM), a novel framework that accelerates the quantile-regression-based generative metamodeling (QRGMM) approach by integrating cubic Hermite interpolation with gradient estimation. Theoretically, we show that E-QRGMM preserves the convergence rate of the original QRGMM while reducing the grid complexity from $O(n^{1/2})$ to $O(n^{1/5})$ for the majority of quantile levels, thereby substantially improving computational efficiency. Empirically, E-QRGMM achieves a superior trade-off between distributional accuracy and training speed compared with both QRGMM and other advanced deep generative models on synthetic and real-world datasets. Moreover, by enabling bootstrap-based construction of confidence intervals for arbitrary estimands, E-QRGMM provides a practical solution for covariate-dependent uncertainty quantification.
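The core mechanism described above (interpolating an estimated quantile function on a coarse grid of quantile levels, then generating samples by inverse-transform sampling) can be illustrated with a minimal sketch. This is not the paper's implementation: the grid size, the use of empirical quantiles as a stand-in for fitted quantile-regression values, and finite-difference derivatives as a stand-in for the gradient estimates are all illustrative assumptions.

```python
import numpy as np

def hermite_interp(tau_grid, q_vals, q_derivs, u):
    """Evaluate a piecewise cubic Hermite interpolant of the quantile
    function at levels u, given values and derivatives on a coarse grid."""
    # locate the grid interval containing each quantile level u
    idx = np.clip(np.searchsorted(tau_grid, u) - 1, 0, len(tau_grid) - 2)
    h = tau_grid[idx + 1] - tau_grid[idx]
    t = (u - tau_grid[idx]) / h
    # standard cubic Hermite basis functions on [0, 1]
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return (h00 * q_vals[idx] + h10 * h * q_derivs[idx]
            + h01 * q_vals[idx + 1] + h11 * h * q_derivs[idx + 1])

rng = np.random.default_rng(0)
# toy stand-in for the conditional distribution at one covariate value
data = rng.normal(loc=2.0, scale=1.5, size=10_000)

# coarse grid of quantile levels: the point of E-QRGMM is that O(n^{1/5})
# grid points suffice, rather than the O(n^{1/2}) grid used by QRGMM
tau_grid = np.linspace(0.01, 0.99, 9)
q_vals = np.quantile(data, tau_grid)       # stand-in for quantile-regression fits
q_derivs = np.gradient(q_vals, tau_grid)   # stand-in for gradient estimates

# generate new samples by inverse-transform sampling through the interpolant
u = rng.uniform(0.01, 0.99, size=5_000)
samples = hermite_interp(tau_grid, q_vals, q_derivs, u)
```

Because the interpolant approximates the inverse CDF, the generated `samples` should roughly reproduce the distribution of `data`; a bootstrap over refitted metamodels would then yield covariate-dependent confidence intervals for any estimand computed from such samples.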