Equivariant neural networks are neural networks with symmetry. Motivated by the theory of group representations, we decompose the layers of an equivariant neural network into simple representations. The nonlinear activation functions lead to interesting nonlinear equivariant maps between simple representations. For example, the rectified linear unit (ReLU) gives rise to piecewise linear maps. We show that these considerations lead to a filtration of equivariant neural networks, generalizing Fourier series. This observation might provide a useful tool for interpreting equivariant neural networks.
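The claim that ReLU yields piecewise linear equivariant maps can be illustrated with a minimal sketch: because ReLU acts pointwise, it commutes with any permutation representation, so it is an equivariant (piecewise linear) map for that action. The group, dimension, and variable names below are illustrative choices, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Pointwise rectified linear unit: a piecewise linear map.
    return np.maximum(x, 0.0)

# Permutation representation of S_4 on R^4: rho(g) acts by permuting coordinates.
perm = rng.permutation(4)
P = np.eye(4)[perm]  # permutation matrix for rho(g)

x = rng.standard_normal(4)

# Equivariance check: relu(rho(g) x) == rho(g) relu(x),
# which holds because ReLU is applied coordinate-wise.
lhs = relu(P @ x)
rhs = P @ relu(x)
print("equivariant:", np.allclose(lhs, rhs))
```

Note that this only demonstrates equivariance for permutation representations; for a general representation, pointwise ReLU need not commute with the group action, which is what makes the decomposition into simple representations nontrivial.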