Despite significant progress in time series forecasting, existing forecasters often overlook the heterogeneity between long-range and short-range time series, leading to performance degradation in practical applications. In this work, we highlight the need for distinct objectives tailored to different ranges. We point out that time series can be decomposed into global patterns and local variations, which should be addressed separately in long- and short-range time series. To meet these objectives, we propose State Space Transformer (SST), a multi-scale hybrid Mamba-Transformer model with two experts. SST leverages Mamba as one expert to extract global patterns from coarse-grained long-range time series, and a Local Window Transformer (LWT) as the other expert to capture local variations in fine-grained short-range time series. With its input-dependent mechanism, the State Space Model (SSM)-based Mamba selectively retains long-term patterns and filters out fluctuations, while LWT employs a local window to enhance locality awareness and thus effectively capture local variations. To adaptively integrate the global patterns and local variations, a long-short router dynamically adjusts the contributions of the two experts. SST achieves superior performance while scaling linearly, $O(L)$, in the time series length $L$. Comprehensive experiments demonstrate that SST achieves state-of-the-art results in long- and short-range time series forecasting while maintaining a low memory footprint and computational cost. The code of SST is available at https://github.com/XiongxiaoXu/SST.
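To make the expert-mixing idea concrete, the sketch below shows a minimal, hypothetical version of a long-short router that blends the feature vectors of two experts via a softmax gate. This is an illustration only, not the SST implementation: in SST the router is dynamic (input-dependent) and the experts are Mamba and LWT, whereas here the gate uses fixed logits and the expert outputs are placeholder vectors.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D array
    e = np.exp(x - x.max())
    return e / e.sum()

def long_short_router(h_global, h_local, logits):
    """Blend two experts' features with a softmax gate.

    h_global: (d,) features from the long-range expert (stand-in for Mamba).
    h_local:  (d,) features from the short-range expert (stand-in for LWT).
    logits:   (2,) router scores; in SST these would be computed from the
              input, here they are fixed for illustration.
    """
    g = softmax(logits)                      # per-expert contributions, sum to 1
    return g[0] * h_global + g[1] * h_local  # adaptive combination

# Toy usage with hypothetical expert outputs
h_g = np.array([1.0, 0.0])   # "global pattern" features
h_l = np.array([0.0, 1.0])   # "local variation" features
mix = long_short_router(h_g, h_l, np.array([0.0, 0.0]))  # equal gate weights
```

With equal logits the gate assigns each expert a weight of 0.5; skewing the logits toward one expert shifts the blend accordingly, which is the role the long-short router plays between Mamba's global patterns and LWT's local variations.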