Time-series forecasting (TSF) is a classic problem in artificial intelligence. Models such as the recurrent neural network (RNN), long short-term memory (LSTM), and gated recurrent unit (GRU) have improved the predictive accuracy of TSF. Furthermore, model structures have been proposed that combine time-series decomposition methods, such as seasonal-trend decomposition using Loess (STL), to further improve predictive accuracy. However, because this approach trains an independent model for each component, it cannot learn the relationships between the time-series components. In this study, we propose a new neural architecture, the correlation recurrent unit (CRU), which performs time-series decomposition within a single neural cell and learns the correlations (autocorrelation and cross-correlation) between the decomposed components. The proposed architecture was evaluated in comparative experiments against previous studies on five univariate and four multivariate time-series datasets. The results show that both long- and short-term predictive performance improved by more than 10%, indicating that the CRU outperforms other neural architectures on TSF problems.
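For context, the decomposition-based approach the abstract critiques first splits a series into trend, seasonal, and residual components, then fits an independent model to each. The following is a minimal NumPy sketch of such an additive decomposition, using a classical moving-average method as a simplified stand-in for STL; the function name and synthetic data are illustrative, not from the paper:

```python
import numpy as np

def decompose(y, period):
    """Classical additive decomposition: trend via a centered moving
    average, seasonal via per-phase means of the detrended series.
    (A simplified stand-in for STL.)"""
    n = len(y)
    kernel = np.ones(period) / period
    trend = np.convolve(y, kernel, mode="same")  # moving-average trend
    detrended = y - trend
    # Seasonal component: average of each phase, centered to sum to zero.
    seasonal = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal -= seasonal.mean()
    seasonal = np.tile(seasonal, n // period + 1)[:n]
    resid = y - trend - seasonal  # residual absorbs everything else
    return trend, seasonal, resid

# Synthetic monthly series: linear trend + yearly seasonality + noise.
rng = np.random.default_rng(0)
t = np.arange(120)
y = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(120)

trend, seasonal, resid = decompose(y, period=12)
# The additive decomposition reconstructs the series by construction.
assert np.allclose(trend + seasonal + resid, y)
```

In the criticized pipeline, each of `trend`, `seasonal`, and `resid` would be forecast by its own model and the forecasts summed; the CRU instead performs the decomposition inside the cell, so the correlations between components can be learned jointly.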