Generalized Sliced Inverse Regression (GSIR) is one of the most important methods for nonlinear sufficient dimension reduction. As shown in Li and Song (2017), it enjoys a convergence rate that is independent of the dimension of the predictor, thus avoiding the curse of dimensionality. In this paper we establish an improved convergence rate for GSIR under additional mild conditions on the eigenvalue decay rate and the smoothness. Under appropriate decay-rate and smoothness parameters, our convergence rate can be made arbitrarily close to $n^{-1/3}$; by comparison, the rate of Li and Song (2017) is $n^{-1/4}$ under the best conditions. This improvement is significant because, for example, in a semiparametric estimation problem involving an infinite-dimensional nuisance parameter, the estimator of the nuisance parameter is often required to converge faster than $n^{-1/4}$ to guarantee desired semiparametric properties such as asymptotic efficiency. The improved convergence rate meets this requirement; the original rate does not. The sharpened convergence rate can also be established for GSIR in more general settings, such as functional sufficient dimension reduction.