Time series forecasting perennially faces the challenge of concept drift, where data distributions evolve over time, degrading forecast model performance. Existing solutions are based on online learning, which continually organizes recent time series observations into new training samples and updates model parameters according to the forecasting feedback on recent data. However, they overlook a critical issue: the ground-truth future values of each sample become available only after its forecast horizon has elapsed. This delay creates a temporal gap between the training samples and the test sample. Our empirical analysis reveals that this gap can itself introduce concept drift, causing forecast models to adapt to outdated concepts. In this paper, we present \textsc{Proceed}, a novel proactive model adaptation framework for online time series forecasting. \textsc{Proceed} first estimates the concept drift between the recently used training samples and the current test sample. It then employs an adaptation generator to efficiently translate the estimated drift into parameter adjustments, proactively adapting the model to the test sample. To enhance the generalization capability of the framework, \textsc{Proceed} is trained on a diverse set of synthetic concept drifts. Extensive experiments on five real-world datasets across various forecast models demonstrate that \textsc{Proceed} yields larger performance improvements than state-of-the-art online learning methods, substantially improving the resilience of forecast models to concept drift. Code is available at \url{https://github.com/SJTU-DMTai/OnlineTSF}.
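To make the proactive-adaptation idea concrete, the following is a minimal sketch of the two steps the abstract describes: estimating drift between recent training samples and the current test sample, and mapping that drift to parameter adjustments. All names (`concept_repr`, `AdaptationGenerator`), the choice of mean/std statistics as the concept representation, and the linear toy forecaster are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def concept_repr(window):
    # Illustrative concept summary: simple distribution statistics of a window.
    return np.array([window.mean(), window.std()])

def estimate_drift(train_windows, test_window):
    # Drift = test sample's concept minus the average concept of the
    # recently used (delayed-feedback) training samples.
    train_concept = np.mean([concept_repr(w) for w in train_windows], axis=0)
    return concept_repr(test_window) - train_concept

class AdaptationGenerator:
    """Hypothetical generator: maps a drift vector to additive
    per-parameter adjustments via a learned linear map."""
    def __init__(self, drift_dim, n_params):
        self.W = rng.normal(scale=0.01, size=(n_params, drift_dim))

    def __call__(self, drift):
        return self.W @ drift

# Toy forecast model: a linear map from lookback window to horizon.
lookback, horizon = 8, 2
theta = rng.normal(size=(horizon, lookback))

train_windows = [rng.normal(loc=0.0, size=lookback) for _ in range(16)]
test_window = rng.normal(loc=1.0, size=lookback)  # mean has drifted

drift = estimate_drift(train_windows, test_window)
gen = AdaptationGenerator(drift_dim=drift.size, n_params=theta.size)
# Proactively adjust parameters before forecasting the test sample.
theta_adapted = theta + gen(drift).reshape(theta.shape)
forecast = theta_adapted @ test_window
```

In the actual framework the generator would be trained (e.g., on synthesized drifts, as the abstract notes) so that its adjustments reduce forecast error under drift; here its weights are random purely to keep the sketch self-contained.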