Time series forecasting remains a critical challenge across numerous domains, yet the effectiveness of complex models often varies unpredictably across datasets. Recent studies highlight the surprising competitiveness of simple linear models, suggesting that their robustness and interpretability warrant deeper theoretical investigation. This paper presents a systematic study of linear models for time series forecasting, with a focus on the role of characteristic roots in temporal dynamics. We begin by analyzing the noise-free setting, where we show that characteristic roots govern long-term behavior, and explain how design choices such as instance normalization and channel independence affect model capabilities. We then extend our analysis to the noisy regime, revealing that models tend to learn spurious roots. This leads to the identification of a key data-scaling property: mitigating the influence of noise requires disproportionately large training data, highlighting the need for structural regularization. To address these challenges, we propose two complementary strategies for robust root restructuring. The first uses rank reduction techniques, including Reduced-Rank Regression and Direct Weight Rank Reduction, to recover the low-dimensional latent dynamics. The second, a novel adaptive method called Root Purge, encourages the model to learn a noise-suppressing null space during training. Extensive experiments on standard benchmarks demonstrate the effectiveness of both approaches, validating our theoretical insights and achieving state-of-the-art results in several settings. Our findings underscore the potential of integrating classical linear systems theory with modern learning techniques to build robust, interpretable, and data-efficient forecasting models.
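The two central ideas above can be illustrated concretely. The following is a minimal sketch, not the paper's implementation: it treats a one-step linear forecaster as an autoregression, reads off its characteristic roots as eigenvalues of the companion matrix, and then applies a generic rank reduction (truncated SVD of the weight matrix) of the kind the abstract describes. All names (`L`, `H`, `r`) and the toy weights are assumptions introduced for illustration.

```python
import numpy as np

L = 8  # lookback window length (illustrative)
# Toy one-step weights encoding the AR(2) recurrence
# y_t = 1.2*y_{t-1} - 0.36*y_{t-2}, with input ordering
# x = [y_{t-L}, ..., y_{t-1}].
w = np.zeros(L)
w[-1], w[-2] = 1.2, -0.36

# Characteristic roots: eigenvalues of the companion matrix of the learned
# recurrence. Magnitudes < 1 imply decaying long-term behavior; magnitudes
# > 1 imply divergence.
C = np.zeros((L, L))
C[0, :] = w[::-1]           # first row holds the AR coefficients, newest first
C[1:, :-1] = np.eye(L - 1)  # subdiagonal shifts the state vector
mags = np.sort(np.abs(np.linalg.eigvals(C)))[::-1]
print(mags[:3])  # dominant magnitudes ~0.6, 0.6 from the (z - 0.6)^2 factor

# Rank reduction: replace a (possibly noise-corrupted) H x L weight matrix
# with its best rank-r approximation, discarding directions that mostly
# carry noise rather than latent dynamics.
rng = np.random.default_rng(0)
H, r = 4, 2
W_clean = rng.standard_normal((H, r)) @ rng.standard_normal((r, L))  # rank r
W_noisy = W_clean + 0.01 * rng.standard_normal((H, L))
U, s, Vt = np.linalg.svd(W_noisy, full_matrices=False)
W_r = (U[:, :r] * s[:r]) @ Vt[:r]  # truncated SVD = closest rank-r matrix
print(np.linalg.matrix_rank(W_r))
```

Truncated SVD stands in here for the paper's Direct Weight Rank Reduction; Reduced-Rank Regression instead imposes the rank constraint during fitting rather than post hoc.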