Predicting future values in multivariate time series is vital across many domains. This work explores the use of large language models (LLMs) for the task; however, LLMs typically handle only one-dimensional data. We introduce MultiCast, a zero-shot LLM-based approach to multivariate time series forecasting. It enables LLMs to take multivariate time series as input through three novel token multiplexing solutions that effectively reduce dimensionality while preserving key repetitive patterns. In addition, a quantization scheme helps LLMs better capture these patterns while significantly reducing token usage in practical applications. We evaluate our approach in terms of RMSE and execution time against state-of-the-art methods on three real-world datasets.
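The multiplexing and quantization steps might be sketched roughly as follows. This is a minimal illustration, not the paper's method: the abstract does not specify the three multiplexing layouts or the binning granularity, so the time-major interleaving and uniform binning below are assumptions.

```python
import numpy as np

def multiplex_channels(mv_series):
    """Interleave the channels of a multivariate series (shape (T, C))
    into a single one-dimensional stream, time-major. One of several
    possible multiplexing layouts; hypothetical, not MultiCast's."""
    return mv_series.reshape(-1)

def quantize_series(series, n_bins=100):
    """Map real values to integer token ids via uniform binning,
    so a numeric series becomes a short discrete token sequence.
    A hypothetical quantization scheme for illustration only."""
    lo, hi = series.min(), series.max()
    # Scale to [0, n_bins - 1] and round to integer token ids.
    tokens = np.round((series - lo) / (hi - lo + 1e-12) * (n_bins - 1))
    return tokens.astype(int), (lo, hi)

# Toy 2-channel series with 4 time steps.
mv = np.array([[0.0, 10.0], [1.0, 11.0], [2.0, 12.0], [3.0, 13.0]])
flat = multiplex_channels(mv)                  # length T*C = 8
tokens, (lo, hi) = quantize_series(flat, n_bins=14)
```

The resulting integer tokens can be rendered as text and fed to an LLM; the stored `(lo, hi)` range lets forecasted tokens be mapped back to real values.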