Accurate short-term load forecasting (STLF) for household electricity is key to future, sustainable energy systems. While numerous studies have analyzed statistical, machine learning, and deep learning approaches for household electricity STLF, recently proposed time series foundation models such as Chronos, TimesFM, and Time-MoE promise a new approach. These models are pre-trained on vast amounts of time series data and can forecast time series without explicit task-specific training (zero-shot learning). In this study, we benchmark the forecasting capabilities of time series foundation models against Trained-from-Scratch (TFS) Transformer-based approaches. Our results suggest that foundation models perform comparably to TFS Transformer models, and that certain foundation models outperform all TFS models as the input size increases. At the same time, foundation models require less effort, as they need no domain-specific training and only limited contextual data for inference.