Recurrent Neural Networks (RNNs) have revolutionized many areas of machine learning, particularly natural language and sequential data processing. Long Short-Term Memory (LSTM) networks have demonstrated their ability to capture long-term dependencies in sequential data. Inspired by Kolmogorov-Arnold Networks (KANs), a promising alternative to Multi-Layer Perceptrons (MLPs), we propose a new neural network architecture that draws on both KANs and LSTMs: the Temporal Kolmogorov-Arnold Network (TKAN). TKANs combine the strengths of both networks; they are composed of Recurrent Kolmogorov-Arnold Network (RKAN) layers with embedded memory management. This innovation enables multi-step time series forecasting with enhanced accuracy and efficiency. By addressing the limitations of traditional models in handling complex sequential patterns, the TKAN architecture offers significant potential for fields requiring forecasts more than one step ahead.
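The abstract describes RKAN layers with embedded memory management only at a high level. As a loose illustration of the idea (not the paper's actual architecture), one can sketch an LSTM-style cell whose candidate transform is replaced by a KAN-style layer of learnable per-edge activations. Every name here (`KANLayer`, `TKANCell`) and the choice of a radial-basis surrogate for the spline activations are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class KANLayer:
    """KAN-style layer: each input-output edge applies its own learnable
    1-D function, approximated here by a sum of fixed Gaussian radial
    basis functions with learnable coefficients (a spline surrogate)."""
    def __init__(self, n_in, n_out, n_basis=8):
        self.centers = np.linspace(-2.0, 2.0, n_basis)  # fixed RBF grid
        self.coef = 0.1 * rng.standard_normal((n_out, n_in, n_basis))

    def __call__(self, x):
        # phi[i, b]: response of basis b to input component i
        phi = np.exp(-(x[:, None] - self.centers[None, :]) ** 2)
        # out[o] = sum_i sum_b coef[o, i, b] * phi[i, b]
        return np.einsum("oib,ib->o", self.coef, phi)

class TKANCell:
    """Toy LSTM-like recurrent cell whose candidate state update goes
    through a KAN layer; the gates stay linear for brevity. This is a
    hypothetical simplification, not the TKAN architecture itself."""
    def __init__(self, n_in, n_hidden):
        self.kan = KANLayer(n_in + n_hidden, n_hidden)
        self.Wg = 0.1 * rng.standard_normal((3 * n_hidden, n_in + n_hidden))

    def step(self, x, h, c):
        z = np.concatenate([x, h])
        f, i, o = np.split(sigmoid(self.Wg @ z), 3)  # forget/input/output gates
        c = f * c + i * np.tanh(self.kan(z))         # KAN-based candidate state
        h = o * np.tanh(c)
        return h, c

cell = TKANCell(n_in=1, n_hidden=4)
h, c = np.zeros(4), np.zeros(4)
for t in range(10):                                  # roll the cell over a toy sequence
    h, c = cell.step(np.array([np.sin(t / 3.0)]), h, c)
print(h.shape)
```

The gating mimics the LSTM's memory management, while the per-edge basis expansion stands in for the learnable activation functions that distinguish KANs from MLPs.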