Irregularly sampled time series forecasting, characterized by non-uniform observation intervals, is prevalent in practical applications. However, previous research has focused on regular time series forecasting, typically relying on transformer architectures. To extend transformers to irregular time series, we address the positional embedding, which encodes the temporal information of the data. We propose CTLPE, a method that learns a continuous linear function for encoding temporal information. The two challenges of irregular time series, inconsistent observation patterns and irregular time gaps, are addressed by learning a continuous-time function and a concise representation of position. Additionally, the continuous linear function is empirically shown to be superior to other continuous functions through comparison with a neural controlled differential equation-based positional embedding, and is theoretically supported by properties of an ideal positional embedding. CTLPE outperforms existing techniques across various irregularly sampled time series datasets, demonstrating its effectiveness.
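To make the core idea concrete, the following is a minimal sketch of a continuous-time linear positional embedding: each embedding dimension is an affine function of the raw observation timestamp, so irregular time gaps map directly to proportional differences in the embedding. The shapes, parameter names, and dimensionality here are illustrative assumptions, not CTLPE's exact parameterization, which the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model = 8                    # embedding dimension (assumed for illustration)
w = rng.normal(size=d_model)   # learnable slope per dimension (assumed)
b = rng.normal(size=d_model)   # learnable offset per dimension (assumed)

def linear_time_embedding(timestamps):
    """Map irregular timestamps (shape [n]) to embeddings of shape [n, d_model].

    Because the map is affine in t, the embedding is defined for any real
    timestamp, not just integer positions on a regular grid.
    """
    t = np.asarray(timestamps, dtype=float)[:, None]  # [n, 1]
    return t * w + b                                  # broadcast to [n, d_model]

# Irregularly spaced observation times pose no problem:
emb = linear_time_embedding([0.0, 0.3, 1.7, 4.2])
print(emb.shape)  # (4, 8)
```

In a transformer, such an embedding would typically be added to (or concatenated with) the value embedding of each observation before attention, replacing the discrete index-based positional encoding used for regular series.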