Echo State Networks (ESNs) are a particular class of untrained Recurrent Neural Networks (RNNs) within the Reservoir Computing (RC) framework, popular for their fast and efficient learning. However, traditional ESNs often struggle with long-term information processing. In this paper, we introduce a novel class of deep untrained RNNs based on temporal residual connections, called Deep Residual Echo State Networks (DeepResESNs). We show that leveraging a hierarchy of untrained residual recurrent layers significantly boosts memory capacity and long-term temporal modeling. For the temporal residual connections, we consider different orthogonal configurations, including randomly generated and fixed-structure ones, and we study their effect on network dynamics. A thorough mathematical analysis establishes necessary and sufficient conditions for stable dynamics within DeepResESNs. Our experiments on a variety of time-series tasks showcase the advantages of the proposed approach over traditional shallow and deep RC.
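To make the core idea concrete, the sketch below shows a single untrained reservoir layer whose state update is augmented with a temporal residual branch through an orthogonal matrix. This is an illustrative toy, not the paper's exact formulation: the layer sizes, the scaling coefficients `alpha` and `beta`, and the choice of a random-orthogonal residual matrix (obtained via QR; an identity matrix would be a fixed-structure alternative) are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, n_in = 100, 1

# Untrained input and recurrent weights, as in standard ESNs.
# The recurrent matrix is rescaled to a chosen spectral radius (0.9 here).
W_in = rng.uniform(-1, 1, (n_units, n_in))
W = rng.uniform(-1, 1, (n_units, n_units))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Orthogonal matrix for the temporal residual branch: a random orthogonal
# matrix from a QR decomposition (one possible configuration among the
# randomly generated / fixed-structure options discussed in the paper).
Q, _ = np.linalg.qr(rng.standard_normal((n_units, n_units)))

# Illustrative scaling coefficients for the residual and nonlinear branches
# (assumed values, chosen so the toy dynamics stay bounded).
alpha, beta = 0.9, 0.5

def step(x, u):
    # Residual update: an orthogonal skip of the previous state plus a
    # nonlinear reservoir contribution driven by the input.
    return alpha * (Q @ x) + beta * np.tanh(W @ x + W_in @ u)

# Drive the reservoir with a short sinusoidal input sequence.
x = np.zeros(n_units)
for u in np.sin(np.linspace(0, 4 * np.pi, 50)):
    x = step(x, np.array([u]))
```

In a deep variant, several such layers would be stacked, each layer receiving the previous layer's state sequence as its input; the orthogonal skip path is what lets information from distant time steps propagate without contraction through the nonlinearity.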