We introduce a novel class of untrained Recurrent Neural Networks (RNNs) within the Reservoir Computing (RC) paradigm, called Residual Reservoir Memory Networks (ResRMNs). A ResRMN combines a linear memory reservoir with a non-linear reservoir, where the latter is based on residual orthogonal connections along the temporal dimension for enhanced long-term propagation of the input. The resulting reservoir state dynamics are studied through the lens of linear stability analysis, and we investigate diverse configurations for the temporal residual connections. The proposed approach is empirically assessed on time-series and pixel-level 1-D classification tasks. Our experimental results highlight the advantages of the proposed approach over other conventional RC models.
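To make the architecture concrete, the following is a minimal, hypothetical sketch of the two untrained reservoirs described above: a non-linear reservoir whose state is carried forward through an orthogonal residual connection in time, and a purely linear memory reservoir. All names, dimensions, and scaling choices here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n):
    """Random orthogonal matrix via QR decomposition."""
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

# Illustrative dimensions (hypothetical choices)
n_units, n_in, T = 50, 1, 200

# Non-linear reservoir with an orthogonal residual connection along time:
#   x[t+1] = O @ x[t] + tanh(W_h @ x[t] + W_in @ u[t])
O = random_orthogonal(n_units)                    # temporal residual branch
W_h = rng.uniform(-1.0, 1.0, (n_units, n_units))
W_h *= 0.5 / np.max(np.abs(np.linalg.eigvals(W_h)))  # rescale spectral radius
W_in = rng.uniform(-0.1, 0.1, (n_units, n_in))

# Linear memory reservoir (no non-linearity in the state update):
#   m[t+1] = O_m @ m[t] + V_in @ u[t]
O_m = random_orthogonal(n_units)
V_in = rng.uniform(-0.1, 0.1, (n_units, n_in))

u = rng.standard_normal((T, n_in))                # toy input sequence
x = np.zeros(n_units)
m = np.zeros(n_units)
for t in range(T):
    x = O @ x + np.tanh(W_h @ x + W_in @ u[t])
    m = O_m @ m + V_in @ u[t]

# In the RC paradigm, only a linear readout on the concatenated
# reservoir states would be trained; the matrices above stay fixed.
state = np.concatenate([x, m])
```

Since the orthogonal matrices are norm-preserving, the residual branch carries past states forward without exponential decay, which is the intuition behind the enhanced long-term input propagation; stability of the full non-linear dynamics is what the paper's linear stability analysis examines.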