This paper is concerned with the fundamental limits of learning nonlinear dynamical systems from input-output traces. Specifically, we show that recurrent neural networks (RNNs) can learn, in a metric-entropy-optimal manner, nonlinear systems that satisfy a Lipschitz property and forget past inputs sufficiently fast. As the sets of sequence-to-sequence maps realized by the dynamical systems we consider are significantly more massive than the function classes generally studied in deep neural network approximation theory, a refined metric-entropy characterization is needed, namely in terms of order, type, and generalized dimension. We compute these quantities for the classes of exponentially-decaying and polynomially-decaying Lipschitz fading-memory systems and show that RNNs can achieve them.