Generative modeling builds on and substantially advances the classical idea of simulating synthetic data from observed samples. This paper shows that this principle is not only natural but also theoretically well-founded for bootstrap inference: it yields statistically valid confidence intervals that apply simultaneously to regular and irregular estimators, including settings in which Efron's bootstrap fails. In this sense, the generative-model-based bootstrap can be viewed as a modern version of the smoothed bootstrap: it can mitigate the curse of dimensionality and remains effective in challenging regimes where estimators may lack root-$n$ consistency or a Gaussian limit.
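The connection to the smoothed bootstrap can be illustrated with a minimal sketch: fit a simple generative model to the observed sample, draw synthetic datasets from it, and form a percentile confidence interval from the re-computed estimates. Here the "generative model" is a Gaussian kernel density estimate standing in for whatever generator the method uses; the estimator (a median), the sample, and all variable names are illustrative assumptions, not the paper's construction.

```python
# Hedged sketch of a generative-model-based (smoothed) bootstrap CI.
# The generative model here is a Gaussian KDE; the paper's generator
# may differ. All names and choices below are illustrative.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=2.0, size=200)  # observed sample

kde = gaussian_kde(data)       # fit a generative model to the sample
theta_hat = np.median(data)    # estimator of interest (illustrative)

B = 2000                       # number of bootstrap replicates
boot_stats = np.empty(B)
for b in range(B):
    # Draw a synthetic dataset from the fitted model, not from the
    # empirical distribution as in Efron's bootstrap.
    synthetic = kde.resample(len(data), seed=rng).ravel()
    boot_stats[b] = np.median(synthetic)  # re-compute the estimator

# Percentile confidence interval at nominal 95% coverage.
lo, hi = np.percentile(boot_stats, [2.5, 97.5])
```

Replacing the KDE with a modern generative model trained on the sample gives the high-dimensional analogue the abstract describes; the resampling and percentile steps are unchanged.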