Accurate household electricity short-term load forecasting (STLF) is key to future, sustainable energy systems. While various studies have analyzed statistical, machine learning, or deep learning approaches for household electricity STLF, recently proposed time series foundation models such as Chronos, TimesFM, or Lag-Llama have not yet been considered for this task. These models are trained on vast amounts of time series data and can forecast a time series without explicit task-specific training (zero-shot learning). In this study, we benchmark the forecasting capabilities of time series foundation models against Trained-from-Scratch (TFS) Transformer-based approaches. Our results suggest that foundation models perform comparably to TFS Transformer models, and that the TimesFM foundation model outperforms all TFS models as the input size increases. At the same time, foundation models require less effort, as they need no domain-specific training and only limited contextual data for inference.
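To make the zero-shot workflow concrete, here is a minimal sketch of the inference-only pattern the abstract describes. A real foundation model such as Chronos or TimesFM would replace the seasonal-naive stand-in below; the `predict(context, horizon)` interface is hypothetical and chosen only to mirror the key property: no training step, just a context window in and a forecast out.

```python
def seasonal_naive_predict(context, horizon, season=24):
    """Stand-in 'model': repeat the last full season of the context.

    Mirrors zero-shot use: the forecaster sees only the context window
    at inference time and was never fit on this household's data.
    """
    last_season = context[-season:]
    return [last_season[i % season] for i in range(horizon)]

# Hourly household load: a simple repeating daily profile (illustrative data).
daily_profile = [0.3] * 7 + [0.8] * 2 + [0.5] * 8 + [1.2] * 5 + [0.4] * 2
context = daily_profile * 7          # one week of hourly observations
forecast = seasonal_naive_predict(context, horizon=24)

print(forecast == daily_profile)     # the naive stand-in repeats the profile
```

With an actual foundation model, only the `predict` call changes; the surrounding workflow (collect a context window, request a horizon, evaluate the forecast) stays the same, which is why these models need no domain-specific training.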