We consider the problem of sufficient dimension reduction (SDR) for multi-index models. Estimators of the central mean subspace in prior work either have slow (non-parametric) convergence rates or rely on stringent distributional conditions (e.g., the covariate distribution $P_{\mathbf{X}}$ being elliptically symmetric). In this paper, we show that a fast parametric convergence rate of the form $C_d \cdot n^{-1/2}$ is achievable by estimating the \emph{expected smoothed gradient outer product}, for a general class of covariate distributions $P_{\mathbf{X}}$ including Gaussian and heavier-tailed distributions. When the link function is a polynomial of degree at most $r$ and $P_{\mathbf{X}}$ is the standard Gaussian, we show that the prefactor depends on the ambient dimension $d$ as $C_d \propto d^r$.
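The idea behind the smoothed gradient outer product can be illustrated with a minimal numerical sketch. Below, the multi-index model $y = g(\mathbf{B}^\top \mathbf{x})$, the link $g$, the smoothing scale $\sigma$, and the Monte Carlo sample size $m$ are all hypothetical choices for illustration; for simplicity the sketch queries the noiseless link directly (via Gaussian smoothing and Stein's identity), whereas an actual estimator would plug in a regression estimate of the link from data. The top eigenvectors of the averaged outer product then recover the span of $\mathbf{B}$:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, sigma, m = 10, 2000, 0.5, 200  # hypothetical dimensions and smoothing scale

# Hypothetical multi-index model: y = g(B^T x), with B spanning a 2-dim subspace.
B = np.zeros((d, 2))
B[0, 0] = 1.0
B[1, 1] = 1.0
g = lambda u: u[:, 0] ** 2 + np.sin(u[:, 1])

X = rng.standard_normal((n, d))  # standard Gaussian covariates

# Smoothed gradient via Gaussian smoothing and Stein's identity:
#   grad f_sigma(x) = E[ (Z / sigma) * f(x + sigma * Z) ],  Z ~ N(0, I_d).
# For illustration we query the noiseless link g(B^T .) directly; in practice
# this would be a fitted regression function.
def smoothed_grad(x):
    Z = rng.standard_normal((m, d))
    fvals = g((x + sigma * Z) @ B)
    return (Z * fvals[:, None]).mean(axis=0) / sigma

# Average the gradient outer products over a subsample of covariate points.
n_sub = n // 10
G = np.zeros((d, d))
for i in range(n_sub):
    gi = smoothed_grad(X[i])
    G += np.outer(gi, gi)
G /= n_sub

# Since every gradient of g(B^T x) lies in span(B), the top-2 eigenvectors
# of G estimate the central mean subspace.
eigvals, eigvecs = np.linalg.eigh(G)
B_hat = eigvecs[:, -2:]
```

The recovered subspace can be compared to $\mathrm{span}(\mathbf{B})$ through the singular values of $\mathbf{B}^\top \hat{\mathbf{B}}$ (the cosines of the principal angles), which should be close to one when the estimate is accurate.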