Deep nonparametric regression, which uses deep neural networks to learn target functions, has attracted considerable research attention in recent years. Despite substantial progress on convergence rates, the lack of asymptotic properties hinders rigorous statistical inference. To close this gap, we propose a novel framework that, leveraging the conditional diffusion model, recasts the deep estimation paradigm as a platform for conditional mean estimation. Theoretically, we derive an end-to-end convergence rate for the conditional diffusion model and establish the asymptotic normality of the generated samples. This enables the construction of confidence regions and hence rigorous statistical inference. Numerical experiments further validate the efficacy of the proposed method.