We establish the universal approximation property, uniformly with respect to weakly compact families of measures, for several classes of neural networks. To this end, we prove that these neural networks are dense in Orlicz spaces, thereby extending classical universal approximation theorems beyond the traditional $L^p$-setting. The classes covered include widely used architectures such as feedforward neural networks with non-polynomial activation functions, deep narrow networks with ReLU activation, and functional input neural networks.
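To make the shape of these results concrete, the following is a schematic formulation of the uniform approximation statement; the notation ($\Phi$ for a Young function, $\mathcal{K}$ for a weakly compact family of measures, $\mathcal{NN}_{\varrho}$ for the network class with activation $\varrho$) is illustrative and not taken verbatim from the paper:
\[
\forall\, f \in L^{\Phi}, \ \forall\, \varepsilon > 0 \ \exists\, \varphi \in \mathcal{NN}_{\varrho} : \quad \sup_{\mu \in \mathcal{K}} \| f - \varphi \|_{L^{\Phi}(\mu)} < \varepsilon,
\]
where $\| f \|_{L^{\Phi}(\mu)} := \inf \bigl\{ \lambda > 0 : \int \Phi\bigl( |f| / \lambda \bigr) \, \mathrm{d}\mu \le 1 \bigr\}$ denotes the Luxemburg norm on the Orlicz space $L^{\Phi}(\mu)$; taking $\Phi(x) = x^p$ recovers the classical $L^p$-approximation statements as a special case.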