We explore the mathematical foundations of Recurrent Neural Networks ($\mathtt{RNN}$s) and three fundamental procedures: temporal rescaling, discretisation, and linearisation. These procedures provide essential tools for characterising the behaviour of $\mathtt{RNN}$s: temporal rescaling gives insight into temporal dynamics, discretisation enables practical computational implementation, and linearisation yields linear approximations amenable to analysis. We discuss the flexibility in the order in which these procedures can be applied, emphasising its significance for modelling and analysing $\mathtt{RNN}$s in neuroscience and machine learning applications, and we describe explicitly the conditions under which the procedures can be interchanged.
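For concreteness, a minimal sketch of the three procedures applied to a standard leaky continuous-time $\mathtt{RNN}$; the model and the symbols $\tau$ (time constant), $\mathbf{W}$ (recurrent weights), $\phi$ (nonlinearity), and $\mathbf{b}$ (bias) are illustrative assumptions here, not the formulation developed in the text:
\begin{align*}
  &\text{Base model:} &
    \tau\,\dot{\mathbf{x}}(t) &= -\mathbf{x}(t) + \mathbf{W}\phi\bigl(\mathbf{x}(t)\bigr) + \mathbf{b},\\
  &\text{Temporal rescaling } (s = t/\tau): &
    \frac{d\mathbf{x}}{ds} &= -\mathbf{x}(s) + \mathbf{W}\phi\bigl(\mathbf{x}(s)\bigr) + \mathbf{b},\\
  &\text{Discretisation (forward Euler, step } \Delta t): &
    \mathbf{x}_{k+1} &= \Bigl(1 - \tfrac{\Delta t}{\tau}\Bigr)\mathbf{x}_k
      + \tfrac{\Delta t}{\tau}\bigl(\mathbf{W}\phi(\mathbf{x}_k) + \mathbf{b}\bigr),\\
  &\text{Linearisation about a fixed point } \mathbf{x}^{*}: &
    \tau\,\dot{\boldsymbol{\delta}} &\approx \bigl(-\mathbf{I} + \mathbf{W}\mathbf{D}\bigr)\boldsymbol{\delta},
    \qquad \mathbf{D} = \operatorname{diag}\bigl(\phi'(\mathbf{x}^{*})\bigr).
\end{align*}
In this sketch the interchangeability question can be checked directly: linearising the forward-Euler map about $\mathbf{x}^{*}$ gives $\boldsymbol{\delta}_{k+1} = \bigl(\mathbf{I} + \tfrac{\Delta t}{\tau}(-\mathbf{I} + \mathbf{W}\mathbf{D})\bigr)\boldsymbol{\delta}_k$, which coincides with the forward-Euler discretisation of the linearised flow, so for this scheme the two procedures can be applied in either order.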