We study the large-width asymptotics of random fully connected neural networks whose weights are drawn from $\alpha$-stable distributions, the family of heavy-tailed distributions arising as limits in the Gnedenko-Kolmogorov heavy-tailed central limit theorem. We show that, on an arbitrary bounded Euclidean domain $\mathcal{U}$ with smooth boundary, the random field at the infinite-width limit, previously characterized in the literature in terms of its finite-dimensional distributions, has sample functions in the fractional Sobolev-Slobodeckij-type quasi-Banach function space $W^{s,p}(\mathcal{U})$ for integrability indices $p < \alpha$ and suitable smoothness indices $s$ depending on the activation function of the network, and we establish functional convergence of the finite-width processes in $\mathcal{P}(W^{s,p}(\mathcal{U}))$. We then leverage this convergence result to study functional posteriors for edge-preserving Bayesian inverse problems with stable neural network priors.
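For orientation, assuming the standard Gagliardo-type definition (the paper may use an equivalent variant), for $s \in (0,1)$ and $0 < p < \infty$ the space $W^{s,p}(\mathcal{U})$ carries the (quasi-)norm
\[
\|u\|_{W^{s,p}(\mathcal{U})}^{p} \;=\; \|u\|_{L^{p}(\mathcal{U})}^{p} \;+\; \int_{\mathcal{U}}\!\int_{\mathcal{U}} \frac{|u(x)-u(y)|^{p}}{|x-y|^{d+sp}}\,\mathrm{d}x\,\mathrm{d}y,
\]
where $d$ is the dimension of the ambient Euclidean space; for $p < 1$ this is only a quasi-norm, which is why the space is quasi-Banach rather than Banach.

As a purely illustrative companion to the setup above, the following minimal sketch samples one realization of a finite-width network of this kind. It is not the paper's construction: it assumes a one-hidden-layer architecture, symmetric $\alpha$-stable weights and biases, a tanh activation, and the $n^{-1/\alpha}$ output scaling commonly used for stable networks; the function name and all parameter choices are hypothetical.

```python
import numpy as np
from scipy.stats import levy_stable

def sample_stable_network(x, width, alpha=1.8, activation=np.tanh, rng=None):
    """Sample one realization of a width-`width` one-hidden-layer network
    with i.i.d. symmetric alpha-stable weights, evaluated at inputs x.

    x: (m, d) array of points in the domain U; returns f(x) of shape (m,).
    """
    rng = np.random.default_rng(rng)
    m, d = x.shape
    # Input-to-hidden weights and biases: symmetric alpha-stable (beta = 0).
    W1 = levy_stable.rvs(alpha, 0.0, size=(d, width), random_state=rng)
    b1 = levy_stable.rvs(alpha, 0.0, size=width, random_state=rng)
    # Hidden-to-output weights, also symmetric alpha-stable.
    W2 = levy_stable.rvs(alpha, 0.0, size=width, random_state=rng)
    h = activation(x @ W1 + b1)  # (m, width) hidden features
    # n^{-1/alpha} scaling: an assumption here, matching the usual
    # normalization under which width-n stable networks have a limit.
    return (h @ W2) * width ** (-1.0 / alpha)

# Evaluate one realization on a grid in U = (0, 1).
xs = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
f = sample_stable_network(xs, width=1000, alpha=1.8)
```

For $\alpha = 2$ the stable law is Gaussian and the sketch reduces to the familiar Gaussian infinite-width setting; for $\alpha < 2$ the weights have infinite moments of order $\geq \alpha$, which is the reason the integrability index above is restricted to $p < \alpha$.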