In multivariate time series (MTS) forecasting, existing state-of-the-art deep learning approaches tend to focus on autoregressive formulations and overlook the information contained in exogenous indicators. To address this limitation, we present DeformTime, a neural network architecture that attempts to capture correlated temporal patterns from the input space and, hence, improve forecasting accuracy. It deploys two core operations performed by deformable attention blocks (DABs): learning dependencies across variables at different time steps (variable DAB), and preserving temporal dependencies in data from previous time steps (temporal DAB). Input data transformations are explicitly designed to enhance learning from the deformed series of information as it passes through a DAB. We conduct extensive experiments on six MTS data sets, using previously established benchmarks as well as challenging infectious disease modelling tasks that involve more exogenous variables. The results demonstrate that DeformTime improves accuracy over prior competitive methods across the vast majority of MTS forecasting tasks, reducing the mean absolute error by 10% on average. Notably, the performance gains remain consistent over longer forecasting horizons.
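To make the core idea of a deformable attention block concrete, the sketch below illustrates the general deformable-attention mechanism in plain NumPy: a query predicts continuous offsets into the look-back window, values are gathered at those deformed (fractional) positions via linear interpolation, and attention is computed only over the sampled set. This is a minimal, hypothetical illustration of the generic technique, not the authors' implementation; the function name `deformable_attention`, the offset-prediction weights `w_off`, and all shapes are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def deformable_attention(q, kv, w_off):
    """Illustrative (not the paper's) deformable attention.

    q:     (d,)  query vector for the current prediction step
    kv:    (T, d) look-back window of a multivariate series
    w_off: (d, n_samples) hypothetical offset-prediction weights
    Returns a (d,) context vector built from deformed positions only.
    """
    T, d = kv.shape
    # 1. Predict continuous sampling positions in [0, T-1] from the query.
    pos = (np.tanh(q @ w_off) + 1) / 2 * (T - 1)      # (n_samples,)
    # 2. Gather values at the deformed (fractional) positions
    #    via linear interpolation between neighbouring time steps.
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, T - 1)
    frac = (pos - lo)[:, None]
    sampled = (1 - frac) * kv[lo] + frac * kv[hi]     # (n_samples, d)
    # 3. Scaled dot-product attention restricted to the sampled set.
    scores = sampled @ q / np.sqrt(d)                 # (n_samples,)
    return softmax(scores) @ sampled                  # (d,)

# Toy window: 16 time steps, 8 variables (channels).
rng = np.random.default_rng(0)
kv = rng.normal(size=(16, 8))
q = rng.normal(size=8)
w_off = rng.normal(size=(8, 4))   # sample 4 deformed positions
ctx = deformable_attention(q, kv, w_off)
print(ctx.shape)  # (8,)
```

Attending only to a small set of learned positions, rather than the full window, is what lets this family of mechanisms focus on the most informative time steps; in DeformTime this idea is applied both across variables (variable DAB) and across previous time steps (temporal DAB).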