We investigate the approximation of functions $f$ on a bounded domain $\Omega\subset \mathbb{R}^d$ by the outputs of single-hidden-layer ReLU neural networks of width $n$. This form of nonlinear $n$-term dictionary approximation has been intensely studied since it is the simplest case of neural network approximation (NNA). Several celebrated approximation results for this form of NNA introduce novel model classes of functions on $\Omega$ whose approximation rates do not deteriorate as the input dimension $d$ grows. These novel classes include the Barron classes and classes based on sparsity or variation, such as the Radon-domain BV classes. The present paper is concerned with the definition of these novel model classes on domains $\Omega$. Their current definitions do not depend on the domain $\Omega$. A new and more natural definition of model classes on domains is given by introducing the concept of weighted variation spaces. These new model classes are intrinsic to the domain itself. Their importance is that they are strictly larger than the classical (domain-independent) classes. Yet, it is shown that they maintain the same NNA rates.
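For concreteness, a width-$n$ single-hidden-layer ReLU network has the standard parametrization sketched below; the symbols $f_n$, $a_j$, $w_j$, $b_j$ are generic notation chosen here for illustration and are not taken from the paper:
\[
f_n(x) \;=\; \sum_{j=1}^{n} a_j\,\sigma\big(\langle w_j, x\rangle + b_j\big),
\qquad \sigma(t) := \max(0,t), \quad x\in\Omega,
\]
where $w_j\in\mathbb{R}^d$ are inner weights, $b_j\in\mathbb{R}$ are biases, and $a_j\in\mathbb{R}$ are outer coefficients. The $n$ ridge functions $\sigma(\langle w_j,\cdot\rangle + b_j)$ act as the dictionary elements in the nonlinear $n$-term approximation described above.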