The total memory capacity (MC) of linear recurrent neural networks (RNNs) has been proven to equal the rank of the corresponding Kalman controllability matrix, and this rank is almost surely maximal when the connectivity and input weight matrices are drawn from regular distributions. This fact calls into question the usefulness of the metric for distinguishing the performance of linear RNNs in the processing of stochastic signals. This note shows that the MC of random nonlinear RNNs can take arbitrary values within the established upper and lower bounds, depending solely on the scale of the input process. This confirms that the existing definition of MC, in both the linear and nonlinear cases, has no practical value.
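The claim about linear RNNs can be checked numerically: for generic weights, the Kalman controllability matrix is almost surely full rank, so the total MC saturates its upper bound regardless of the particular draw. Below is a minimal sketch, assuming a state dimension `n = 8` and a one-dimensional input (both hypothetical choices, not taken from the note).

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 1  # hypothetical state dimension and input dimension

# Draw connectivity matrix A and input weights B from a continuous
# ("regular") distribution, as in the almost-sure maximality statement.
A = rng.standard_normal((n, n)) / np.sqrt(n)
B = rng.standard_normal((n, m))

# Kalman controllability matrix: [B, AB, A^2 B, ..., A^{n-1} B]
blocks = [B]
for _ in range(n - 1):
    blocks.append(A @ blocks[-1])
C = np.hstack(blocks)

rank = np.linalg.matrix_rank(C)
print(rank)  # almost surely equals n for generic A, B
```

Since the rank generically equals `n` for any such random draw, the metric cannot separate one generic linear RNN from another, which is precisely the objection raised above.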