Neural Forecasters (NFs) are a cornerstone of Long-term Time Series Forecasting (LTSF). However, progress has been hampered by an overemphasis on architectural complexity at the expense of fundamental forecasting principles. In this work, we return to first principles to redesign the LTSF paradigm. We begin by introducing a Multiple Neural Forecasting Theorem that provides a theoretical basis for our approach. We then propose Boosted Direct Output (BDO), a novel forecasting strategy that synergistically combines the advantages of Auto-Regressive (AR) and Direct Output (DO) forecasting. In addition, we stabilize the learning process by smoothly tracking the model's parameters over training. Extensive experiments show that these principled improvements enable a simple MLP to achieve state-of-the-art performance, outperforming recent, more complex models in nearly all cases, without any domain-specific design. Finally, we empirically verify our theorem, establishing a dynamic performance bound and identifying promising directions for future research. The code for review is available at: .