It is well known that randomly initialized, feed-forward, fully-connected neural networks converge weakly to isotropic Gaussian processes in the limit where the widths of all layers go to infinity. In this paper, we propose to use the angular power spectrum of the limiting field to characterize the complexity of the network architecture. In particular, we define sequences of random variables associated with the angular power spectrum, and we provide a full characterization of the network complexity in terms of the asymptotic distribution of these sequences as the depth diverges. On this basis, we classify neural networks as low-disorder, sparse, or high-disorder; we show how this classification highlights a number of distinct features for standard activation functions, and in particular the sparsity properties of ReLU networks. Our theoretical results are also validated by numerical simulations.
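The wide-network Gaussian limit referenced above can be checked empirically. The following is a minimal sketch, not the paper's own experiment: it draws repeated samples of the scalar output of a randomly initialized ReLU network at a fixed input (with the usual 1/sqrt(fan-in) weight scaling) and verifies that the empirical excess kurtosis is close to the Gaussian value of zero. The width, depth, and sample count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_net_output(x, width=128, depth=2, rng=rng):
    # One forward pass through a freshly sampled random ReLU network,
    # with weights scaled by 1/sqrt(fan-in) so pre-activations stay O(1).
    h = x
    for _ in range(depth):
        W = rng.normal(size=(width, h.shape[0])) / np.sqrt(h.shape[0])
        h = np.maximum(W @ h, 0.0)  # ReLU activation
    w = rng.normal(size=h.shape[0]) / np.sqrt(h.shape[0])
    return w @ h  # scalar read-out

# Resample the network 500 times at a fixed input x.
x = rng.normal(size=64)
samples = np.array([random_net_output(x) for _ in range(500)])

# Standardize and compute excess kurtosis; ~0 for a Gaussian.
z = (samples - samples.mean()) / samples.std()
excess_kurtosis = np.mean(z**4) - 3.0
print(f"excess kurtosis: {excess_kurtosis:.3f}")
```

At moderate widths the histogram of `samples` is already close to Gaussian; finite-width corrections shrink as the width grows at fixed depth.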