Long-term time series forecasting (LTSF) is important for various domains but is confronted by challenges in handling complex temporal-contextual relationships. Because multivariate input models underperform some recent univariate counterparts, we posit that the issue lies in the inability of existing multivariate LTSF Transformers to model series-wise relationships: the characteristic differences between series are often captured incorrectly. To address this, we introduce ARM, a multivariate temporal-contextual adaptive learning method, which is an enhanced architecture specifically designed for multivariate LTSF modelling. ARM employs Adaptive Univariate Effect Learning (AUEL), a Random Dropping (RD) training strategy, and Multi-kernel Local Smoothing (MKLS) to better handle individual series' temporal patterns and correctly learn inter-series dependencies. ARM demonstrates superior performance on multiple benchmarks without significantly increasing computational costs compared to the vanilla Transformer, thereby advancing the state-of-the-art in LTSF. ARM is also generally applicable to other LTSF architectures beyond the vanilla Transformer.
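As a rough illustration of the Random Dropping idea, the following is a minimal sketch, assuming RD works by zeroing out randomly chosen input series per training window so the model cannot rely on any fixed subset of channels; the function name, shapes, and drop probability are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def random_dropping(batch, drop_prob=0.5, rng=None):
    """Hypothetical sketch of a Random Dropping (RD) training step:
    each series (channel) in a multivariate window is independently
    zeroed with probability `drop_prob`, encouraging the model to
    learn only genuinely useful inter-series dependencies.

    batch: array of shape (n_windows, length, n_series)
    """
    rng = np.random.default_rng() if rng is None else rng
    # One keep/drop decision per (window, series), broadcast over time,
    # so a dropped series is zeroed for the entire window.
    keep = rng.random((batch.shape[0], 1, batch.shape[2])) >= drop_prob
    return batch * keep

x = np.ones((4, 96, 7))  # 4 windows, length 96, 7 series
y = random_dropping(x, drop_prob=0.5, rng=np.random.default_rng(0))
```

Each series in the output is either fully kept or fully zeroed within a window, which is what distinguishes series-level dropping from element-wise dropout.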