Time series forecasting is an important problem that plays a key role in a variety of applications, including weather forecasting, stock market prediction, and scientific simulations. Although Transformers have proven effective at capturing dependencies, the quadratic complexity of the attention mechanism hinders their adoption in long-range time series forecasting, limiting them to short-range dependencies. Recent progress on state space models (SSMs) has shown impressive performance in modeling long-range dependencies thanks to their subquadratic complexity. Mamba, a representative SSM, enjoys linear time complexity and has achieved strong scalability on tasks that require scaling to long sequences, such as language, audio, and genomics. In this paper, we propose Mambaformer, a hybrid framework that internally combines Mamba for long-range dependencies and Transformer for short-range dependencies, for long-short range time series forecasting. To the best of our knowledge, this is the first paper to combine the Mamba and Transformer architectures for time series data. We investigate possible hybrid architectures that combine Mamba layers and attention layers for long-short range time series forecasting. A comparative study shows that the Mambaformer family can outperform both Mamba and Transformer on the long-short range time series forecasting problem. The code is available at https://github.com/XiongxiaoXu/Mambaformerin-Time-Series.
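To make the hybrid idea concrete, below is a minimal sketch of one possible block that interleaves a self-attention layer (short-range) with a Mamba layer (long-range). The layer ordering, normalization, and hyperparameters here are illustrative assumptions rather than the paper's exact Mambaformer configuration; the sketch also assumes the `mamba_ssm` package, whose fused kernels require a GPU.

```python
# Illustrative hybrid Mamba + attention block (not the paper's exact design).
# Assumes the `mamba_ssm` package is installed; its kernels require CUDA.
import torch
import torch.nn as nn
from mamba_ssm import Mamba


class HybridBlock(nn.Module):
    """One hybrid layer: self-attention (short-range) followed by Mamba (long-range)."""

    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mamba = Mamba(d_model=d_model)  # linear-time SSM layer
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        h = self.norm1(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        x = x + attn_out                    # residual around attention
        x = x + self.mamba(self.norm2(x))   # residual around Mamba
        return x


# Toy usage: a batch of 4 series, 96 time steps, 64-dim embeddings.
device = "cuda"  # mamba_ssm's fused kernels require a GPU
x = torch.randn(4, 96, 64, device=device)
block = HybridBlock(d_model=64).to(device)
print(block(x).shape)  # torch.Size([4, 96, 64])
```

Stacking such blocks lets attention model fine-grained local interactions within each window while the Mamba layers carry state across the full sequence at linear cost; other orderings of the two layers are equally plausible variants of the same idea.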