In this paper, we propose a novel approach that enhances recurrent neural networks (RNNs) by incorporating path signatures into their gating mechanisms. Our method modifies both Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) architectures by replacing their forget and reset gates, respectively, with learnable path signatures. These signatures, which capture the geometric features of the entire path history, provide a richer context for controlling information flow through the network's memory. This modification allows the networks to make memory decisions based on the full historical context rather than just the current input and state. Through experimental studies, we demonstrate that our Signature-LSTM (SigLSTM) and Signature-GRU (SigGRU) models outperform their traditional counterparts across various sequential learning tasks. By leveraging path signatures in recurrent architectures, this method offers new opportunities to enhance performance in time series analysis and forecasting applications.
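The abstract does not fix an exact formulation, so the following is a minimal illustrative sketch, not the paper's implementation: a depth-2 truncated path signature computed with NumPy, fed through a hypothetical learnable projection (`W_f`, `b_f` are assumed names) to produce a forget-gate activation from the full path history rather than from only the current input and previous state.

```python
import numpy as np

def sig_depth2(path):
    """Depth-2 truncated signature of a piecewise-linear path.

    path: (T, d) array of observed points.
    Returns a flat vector of length d + d*d: the level-1 terms
    (total increments) followed by the level-2 iterated integrals.
    """
    inc = np.diff(path, axis=0)          # (T-1, d) segment increments
    s1 = inc.sum(axis=0)                 # level 1: total displacement
    # level 2 as an iterated sum: cross terms over earlier increments,
    # plus the 0.5 * inc_i (outer) inc_i correction for linear segments
    cum = np.cumsum(inc, axis=0) - inc   # sum of increments strictly before i
    s2 = cum.T @ inc + 0.5 * (inc.T @ inc)
    return np.concatenate([s1, s2.ravel()])

def sig_forget_gate(path_so_far, W_f, b_f):
    """Hypothetical signature-based forget gate (names are assumptions):
    a sigmoid over a learnable linear map of the path signature."""
    s = sig_depth2(path_so_far)
    return 1.0 / (1.0 + np.exp(-(W_f @ s + b_f)))

# Usage: for 2-D inputs the depth-2 signature has 2 + 4 = 6 features,
# so a scalar gate needs W_f of shape (6,).
path = np.array([[0.0, 0.0], [0.5, 1.0], [1.0, 2.0]])
rng = np.random.default_rng(0)
f_t = sig_forget_gate(path, rng.normal(size=6), 0.0)
```

A useful sanity check on `sig_depth2` is reparametrization invariance: a straight line sampled at two points and at three points yields the same signature, with level 2 equal to half the outer product of the total increment.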