Mamba has recently emerged as a promising alternative to Transformers, offering near-linear complexity in processing sequential data. However, while channels in time series (TS) data generally have no inherent order, recent studies have adopted Mamba to capture channel dependencies (CD) in TS, introducing a sequential order bias. To address this issue, we propose SOR-Mamba, a TS forecasting method that 1) incorporates a regularization strategy to minimize the discrepancy between two embedding vectors generated from data with reversed channel orders, thereby enhancing robustness to channel order, and 2) eliminates the 1D convolution originally designed to capture local information in sequential data. Furthermore, we introduce channel correlation modeling (CCM), a pretraining task aimed at preserving correlations between channels from the data space to the latent space in order to enhance the ability to capture CD. Extensive experiments demonstrate the efficacy of the proposed method across standard and transfer learning scenarios. Code is available at https://github.com/seunghan96/SOR-Mamba.
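The order-robustness regularizer described above can be sketched in a few lines. This is a minimal toy illustration, not the authors' implementation: the flattening-based `embed` function is a hypothetical stand-in for a channel-scanning encoder (which, like Mamba applied across channels, is sensitive to channel order), and the regularizer simply penalizes the gap between the embeddings of the original and channel-reversed series.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy order-sensitive encoder (an assumption, not the paper's architecture):
# flattening the (channels, length) array couples the output to channel
# position, mimicking the sequential-order bias of a channel-scanning model.
C, L, D = 4, 16, 8                  # channels, sequence length, embedding dim
W = rng.normal(size=(C * L, D)) * 0.1

def embed(x):
    """Map a (C, L) series to a D-dimensional embedding vector."""
    return x.reshape(-1) @ W

x = rng.normal(size=(C, L))
z_fwd = embed(x)                    # embedding of the original channel order
z_rev = embed(x[::-1].copy())       # embedding of the reversed channel order

# Regularization term (sketched): discrepancy between the two embeddings.
# Minimizing this during training pushes the encoder toward channel-order
# robustness.
reg_loss = float(np.mean((z_fwd - z_rev) ** 2))
```

Because the toy encoder is order-sensitive, `reg_loss` is strictly positive here; a perfectly order-robust encoder would drive it to zero.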