This article proposes a novel and widely applicable approach to modeling high-dimensional dependent data when a large number of explanatory variables are available and the signal-to-noise ratio is low. We postulate that a $p$-dimensional response series is the sum of a linear regression on many observable explanatory variables and an error term driven by latent common factors plus idiosyncratic noise. The common factors have dynamic dependence, whereas the covariance matrix of the idiosyncratic noise can have diverging eigenvalues to handle the low signal-to-noise ratios commonly encountered in applications. The regression coefficient matrix is estimated by penalized methods when the dimensions involved are high. We apply factor modeling to the regression residuals, employ a high-dimensional white noise testing procedure to determine the number of common factors, and adopt a projected Principal Component Analysis when the signal-to-noise ratio is low. We establish asymptotic properties of the proposed method, for both fixed and diverging numbers of regressors, as $p$ and the sample size $T$ approach infinity. Finally, we use simulations and empirical applications to demonstrate the efficacy of the proposed approach in finite samples.
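The two-step procedure described above can be illustrated with a minimal simulation sketch. This is not the authors' implementation: the model dimensions, the lasso penalty, and the choice of plain (rather than projected) PCA are all illustrative assumptions, and the number of factors is taken as known instead of being selected by the white noise test.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
T, p, q, r = 400, 20, 30, 2  # sample size, responses, regressors, factors (all hypothetical)

# Simulate y_t = B' x_t + L' f_t + e_t with a sparse coefficient matrix B
X = rng.standard_normal((T, q))            # observable explanatory variables
B = np.zeros((q, p)); B[:3, :] = 1.0       # only the first 3 regressors matter
F = rng.standard_normal((T, r))            # latent common factors
L = rng.standard_normal((r, p))            # factor loadings
Y = X @ B + F @ L + 0.5 * rng.standard_normal((T, p))

# Step 1: penalized (lasso) regression, one response series at a time
B_hat = np.column_stack(
    [Lasso(alpha=0.1).fit(X, Y[:, j]).coef_ for j in range(p)]
)
U = Y - X @ B_hat                          # regression residuals

# Step 2: factor modeling of the residuals via PCA on their sample covariance
# (in the paper, r would be chosen by a high-dimensional white noise test)
cov = U.T @ U / T
eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
loadings_hat = eigvecs[:, -r:]             # top-r eigenvectors as loadings
factors_hat = U @ loadings_hat             # estimated common factors

print(B_hat.shape, factors_hat.shape)
```

In a low signal-to-noise setting the paper replaces the PCA step with a projected PCA; the sketch keeps ordinary PCA only to make the pipeline concrete.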