Existing vector autoregressive (VAR) methods for multivariate time series analysis rely on low-rank matrix approximation or Tucker decomposition to alleviate the over-parameterization issue. In this paper, we propose a sparse Tucker decomposition method with graph regularization for high-dimensional vector autoregressive time series. By stacking the time-series transition matrices into a third-order tensor, the sparse Tucker decomposition is employed to characterize important interactions within the transition tensor and to reduce the number of parameters. Moreover, graph regularization is employed to measure the local consistency of the response, predictor, and temporal factor matrices in the vector autoregressive model. The two proposed regularization techniques can be shown to yield more accurate parameter estimation. A non-asymptotic error bound for the estimator of the proposed method is established, which is lower than those of existing matrix- or tensor-based methods. A proximal alternating linearized minimization (PALM) algorithm is designed to solve the resulting model, and its global convergence is established under very mild conditions. Extensive numerical experiments on synthetic data and real-world datasets verify the superior performance of the proposed method over existing state-of-the-art methods.
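To illustrate the parameter-reduction idea behind stacking the VAR transition matrices into a third-order tensor and applying a Tucker decomposition, the following minimal Python sketch performs a plain truncated HOSVD on a stacked transition tensor. It is only an illustrative example, not the paper's sparse, graph-regularized estimator or its PALM algorithm; the dimensions N, lag order p, and multilinear ranks are hypothetical choices.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a third-order tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd_tucker(T, ranks):
    """Truncated HOSVD: a plain (non-sparse) Tucker approximation of T."""
    factors = []
    for mode, r in enumerate(ranks):
        # Leading left singular vectors of each mode unfolding
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor: G = T x_1 U1^T x_2 U2^T x_3 U3^T
    G = T
    for mode, U in enumerate(factors):
        G = np.moveaxis(np.tensordot(U.T, np.moveaxis(G, mode, 0), axes=1), 0, mode)
    return G, factors

# Hypothetical VAR(p) model y_t = sum_k A_k y_{t-k} + e_t (illustrative sizes)
N, p = 20, 3
A_list = [0.1 * np.random.randn(N, N) for _ in range(p)]
A = np.stack(A_list, axis=2)      # transition tensor of shape (N, N, p)

ranks = (4, 4, 2)                 # assumed multilinear ranks (r1, r2, r3)
G, (U1, U2, U3) = hosvd_tucker(A, ranks)

# Parameter counts: full transition tensor vs. Tucker factorization
full_params = N * N * p
tucker_params = int(np.prod(ranks)) + N * ranks[0] + N * ranks[1] + p * ranks[2]
print(full_params, tucker_params)
```

In the paper's setting, sparsity is additionally imposed on the Tucker factors and a graph regularizer couples the response, predictor, and temporal factor matrices; the sketch above only shows how the multilinear ranks shrink the number of free parameters relative to the full transition tensor.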