We study the geometric properties of random neural networks by investigating the boundary volumes of their excursion sets, for different activation functions, as the depth increases. More specifically, we show that, for activations that are not very regular (e.g., the Heaviside step function), the boundary volumes exhibit fractal behavior, with their Hausdorff dimension monotonically increasing with the depth. On the other hand, for more regular activations (e.g., ReLU, logistic, and $\tanh$), as the depth increases, the expected boundary volumes can either converge to zero, remain constant, or diverge exponentially, depending on a single spectral parameter that can be easily computed. Our theoretical results are confirmed by numerical experiments based on Monte Carlo simulations.
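The Monte Carlo experiments mentioned above can be sketched minimally as follows, assuming a fully connected Gaussian network on a 2-D input domain and a crude grid-based perimeter estimator for the excursion-set boundary; the width, weight scaling, grid resolution, and excursion level are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_network_field(xs, depth, width, activation, rng):
    """Evaluate a random fully connected network at 2-D grid points xs.

    Weights are i.i.d. Gaussian with a 1/sqrt(fan_in) scaling
    (an assumed convention, not necessarily the paper's normalization).
    """
    h = xs  # shape (N, 2) at the input layer
    for _ in range(depth):
        W = rng.standard_normal((h.shape[1], width)) / np.sqrt(h.shape[1])
        b = rng.standard_normal(width)
        h = activation(h @ W + b)
    w_out = rng.standard_normal(width) / np.sqrt(width)
    return h @ w_out  # scalar output per grid point

def boundary_length(field_2d, spacing):
    """Crude estimate of the boundary volume (perimeter) of the
    excursion set {field > 0}: count sign changes between
    horizontally and vertically adjacent grid cells."""
    s = field_2d > 0
    crossings = (np.count_nonzero(s[1:, :] != s[:-1, :])
                 + np.count_nonzero(s[:, 1:] != s[:, :-1]))
    return crossings * spacing

# Evaluate on a regular grid over [-1, 1]^2.
n = 200
grid = np.linspace(-1.0, 1.0, n)
xx, yy = np.meshgrid(grid, grid)
xs = np.column_stack([xx.ravel(), yy.ravel()])
spacing = grid[1] - grid[0]

heaviside = lambda z: (z > 0).astype(float)
for act_name, act in [("heaviside", heaviside), ("tanh", np.tanh)]:
    for depth in (1, 3, 5):
        f = random_network_field(xs, depth, 64, act, rng).reshape(n, n)
        print(act_name, depth, boundary_length(f, spacing))
```

With the irregular Heaviside activation, the boundary-length estimate typically grows rapidly with depth (consistent with the fractal behavior described above), while for smooth activations the trend is governed by the spectral parameter; the estimator here is only a discrete proxy for the true boundary volume.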