We introduce a novel sufficient dimension reduction (SDR) method, based on the $\alpha$-distance covariance (dCov), that is robust to outliers. Under very mild conditions on the predictors, the central subspace is estimated effectively via projection onto the Stiefel manifold; the method is model-free, requiring no estimation of the link function. We establish the convergence of the proposed estimator under regularity conditions. Simulations and real data analysis compare our method with existing SDR methods and show that our algorithm improves computational efficiency and effectiveness.
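As a point of reference for the dependence measure used above, the following is a minimal sketch of the sample $\alpha$-distance covariance in the usual V-statistic form, with pairwise distances raised to a power $\alpha \in (0, 2)$; the function names and the specific estimator shown are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def _double_center(d):
    # Double-center a pairwise distance matrix: subtract row and column
    # means, then add back the grand mean.
    return (d - d.mean(axis=0, keepdims=True)
              - d.mean(axis=1, keepdims=True) + d.mean())

def alpha_dcov(x, y, alpha=1.0):
    # Sample alpha-distance covariance between x (n, p) and y (n, q):
    # double-center the alpha-powered distance matrices and average
    # their entrywise product (V-statistic estimate).
    dx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1) ** alpha
    dy = np.linalg.norm(y[:, None, :] - y[None, :, :], axis=-1) ** alpha
    a, b = _double_center(dx), _double_center(dy)
    return np.sqrt(max((a * b).mean(), 0.0))

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 3))
y_dep = x[:, :1] ** 2 + 0.1 * rng.normal(size=(200, 1))  # depends on x
y_ind = rng.normal(size=(200, 1))                        # independent of x
print(alpha_dcov(x, y_dep), alpha_dcov(x, y_ind))
```

With $\alpha < 2$ the measure characterizes independence, which is what makes it usable as a model-free objective: the dependent pair above yields a clearly larger value than the independent pair.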