From the perspective of control theory, convolutional layers (of neural networks) are 2-D (or N-D) linear time-invariant dynamical systems. The usual representation of convolutional layers by the convolution kernel corresponds to the representation of a dynamical system by its impulse response. However, many analysis tools from control theory, e.g., involving linear matrix inequalities, require a state space representation. For this reason, we explicitly provide a state space representation of the Roesser type for 2-D convolutional layers with $c_\mathrm{in}r_1 + c_\mathrm{out}r_2$ states, where $c_\mathrm{in}$/$c_\mathrm{out}$ is the number of input/output channels of the layer and $r_1$/$r_2$ characterizes the width/length of the convolution kernel. This representation is shown to be minimal for $c_\mathrm{in} = c_\mathrm{out}$. We further construct state space representations for dilated, strided, and N-D convolutions.
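The kernel-vs-state-space correspondence described above can be illustrated in the simplest setting: a 1-D, single-channel causal convolution, where the kernel is exactly the impulse response and a state space realization stores the $r-1$ most recent inputs. The following NumPy sketch is an illustrative analog only (function names are ours, and the paper's 2-D Roesser construction with $c_\mathrm{in}r_1 + c_\mathrm{out}r_2$ states is more involved):

```python
import numpy as np

def conv_kernel_to_state_space(K):
    """Realize the FIR filter y[t] = sum_k K[k] u[t-k] as
    s[t+1] = A s[t] + B u[t],  y[t] = C s[t] + D u[t],
    where the state s[t] holds the r-1 most recent past inputs."""
    K = np.asarray(K, dtype=float)
    n = len(K) - 1                 # number of states (r - 1)
    A = np.eye(n, k=-1)            # shift register: push inputs down
    B = np.zeros((n, 1)); B[0, 0] = 1.0
    C = K[1:].reshape(1, n)        # taps acting on stored past inputs
    D = np.array([[K[0]]])         # direct feedthrough (current input)
    return A, B, C, D

def simulate(A, B, C, D, u):
    """Run the state space recursion over an input sequence u."""
    s = np.zeros((A.shape[0], 1))
    y = []
    for ut in u:
        y.append(float(C @ s + D * ut))
        s = A @ s + B * ut
    return np.array(y)

K = np.array([0.5, -1.0, 2.0])          # example kernel, r = 3
u = np.random.default_rng(0).standard_normal(20)
y_ss = simulate(*conv_kernel_to_state_space(K), u)
y_ref = np.convolve(u, K)[:len(u)]      # causal convolution, truncated
assert np.allclose(y_ss, y_ref)
```

Both computations produce the same output sequence, which is the 1-D version of the equivalence the abstract states: the kernel (impulse response) and the $(A, B, C, D)$ realization describe the same LTI system.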