The \textit{Temporal Fusion Transformer} (TFT), proposed by Lim \textit{et al.} and published in the \textit{International Journal of Forecasting} (2021), is a state-of-the-art attention-based deep neural network architecture designed specifically for multi-horizon time series forecasting, and it has demonstrated significant performance improvements over existing benchmarks. In this work, we introduce the Quantum Temporal Fusion Transformer (QTFT), a hybrid quantum-classical architecture that extends the capabilities of the classical TFT framework. The core idea of this work is inspired by two foundational studies: \textit{The Power of Quantum Neural Networks} by Amira Abbas \textit{et al.}, published in \textit{Nature Computational Science} (2021), and \textit{Quantum Vision Transformers} by El Amine Cherrat \textit{et al.}, published in \textit{Quantum} (2024). A key advantage of our approach is that it is built on a variational quantum algorithm, which enables implementation on current noisy intermediate-scale quantum (NISQ) devices without strict requirements on the number of qubits or circuit depth. Our results demonstrate that the QTFT can be successfully trained on forecasting datasets and accurately predicts future values. In particular, experiments on two different datasets show that the model outperforms its classical counterpart in terms of both training and test loss. These results indicate the potential of quantum computing to enhance deep learning architectures for complex machine learning tasks.