This paper extends the universal approximation property of single-hidden-layer feedforward neural networks beyond compact domains, which is of particular interest for approximation in weighted $C^k$-spaces and weighted Sobolev spaces over unbounded domains. More precisely, assuming that the activation function is non-polynomial, we establish universal approximation results in function spaces defined over non-compact subsets of a Euclidean space, including $L^p$-spaces, weighted $C^k$-spaces, and weighted Sobolev spaces, where the latter two also cover the approximation of the (weak) derivatives. Moreover, we provide dimension-independent rates for approximating a function with sufficiently regular and integrable Fourier transform by neural networks with a non-polynomial activation function.
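For concreteness, the single-hidden-layer feedforward networks in question are typically written in the following form (the notation here is illustrative and not taken from the paper itself):

```latex
% Illustrative form of a single-hidden-layer feedforward network
% with activation function \sigma; the symbols N, a_i, w_i, b_i
% are an assumed generic notation, not the paper's.
\[
  f(x) \;=\; \sum_{i=1}^{N} a_i \, \sigma\!\bigl( \langle w_i, x \rangle + b_i \bigr),
  \qquad a_i, b_i \in \mathbb{R}, \quad w_i \in \mathbb{R}^d,
\]
```

Universal approximation results of the kind summarized above state that, for a non-polynomial $\sigma$, functions of this form are dense in the relevant target space as $N \to \infty$.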