Following the success of Transformer architectures in language modeling, particularly their ability to capture long-range dependencies, researchers have explored how these architectures can be adapted for time-series forecasting. Transformer-based models have been proposed to handle both short- and long-term dependencies when predicting future values from historical data. However, studies such as those by Zeng et al. (2022) and Rizvi et al. (2025) have reported mixed results in long-term forecasting tasks. In this work, we evaluate the Gaussian-based Linear architecture introduced by Rizvi et al. (2025) and present an enhanced version called the Residual Stacked Gaussian Linear (RSGL) model. We also investigate the broader applicability of the RSGL model in additional domains, including financial time series and epidemiological data. Experimental results show that the RSGL model achieves improved prediction accuracy and robustness compared to both the baseline Gaussian Linear and Transformer-based models.
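For illustration only, the following is a minimal PyTorch sketch of what a residual stacked Gaussian linear forecaster could look like. The block structure (a learnable Gaussian weighting over the lookback window followed by a linear projection, stacked with residual connections) and all names (`GaussianLinearBlock`, `RSGL`, `mu`, `sigma`) are assumptions inferred from the model's name, not the reference implementation of Rizvi et al. (2025) or of this paper.

```python
# Hypothetical sketch of a Residual Stacked Gaussian Linear (RSGL) forecaster.
# The design below is an assumption inferred from the model's name, not the
# authors' published architecture.
import torch
import torch.nn as nn


class GaussianLinearBlock(nn.Module):
    """One block: Gaussian-weighted lookback window -> linear projection."""

    def __init__(self, seq_len: int):
        super().__init__()
        # Learnable center and width of the Gaussian weighting (assumed form).
        self.mu = nn.Parameter(torch.tensor(seq_len / 2.0))
        self.sigma = nn.Parameter(torch.tensor(seq_len / 4.0))
        self.linear = nn.Linear(seq_len, seq_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len); weight each time step by a Gaussian kernel.
        t = torch.arange(x.size(-1), dtype=x.dtype, device=x.device)
        w = torch.exp(-0.5 * ((t - self.mu) / self.sigma) ** 2)
        return self.linear(x * w)


class RSGL(nn.Module):
    """Stack of Gaussian-linear blocks with residual (skip) connections."""

    def __init__(self, seq_len: int, pred_len: int, depth: int = 3):
        super().__init__()
        self.blocks = nn.ModuleList(
            GaussianLinearBlock(seq_len) for _ in range(depth)
        )
        self.head = nn.Linear(seq_len, pred_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for block in self.blocks:
            x = x + block(x)  # residual connection around each block
        return self.head(x)


if __name__ == "__main__":
    model = RSGL(seq_len=96, pred_len=24)
    history = torch.randn(8, 96)   # batch of 8 univariate series
    forecast = model(history)      # -> shape (8, 24)
    print(forecast.shape)
```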