We study the universality of complex-valued neural networks with bounded widths and arbitrary depths. Under mild assumptions, we give a full description of those activation functions $\varrho:\mathbb{C}\to \mathbb{C}$ that have the property that their associated networks are universal, i.e., are capable of approximating continuous functions to arbitrary accuracy on compact domains. Precisely, we show that deep narrow complex-valued networks are universal if and only if their activation function is neither holomorphic, nor antiholomorphic, nor $\mathbb{R}$-affine. This is a much larger class of functions than in the dual setting of arbitrary width and fixed depth. Unlike in the real case, the sufficient width differs significantly depending on the considered activation function. We show that a width of $2n+2m+5$ is always sufficient and that in general a width of $\max\{2n,2m\}$ is necessary. We prove, however, that a width of $n+m+3$ suffices for a rich subclass of the admissible activation functions. Here, $n$ and $m$ denote the input and output dimensions of the considered networks. Moreover, for the case of smooth and non-polyharmonic activation functions, we provide a quantitative approximation bound in terms of the depth of the considered networks.
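For concreteness, the following is a purely illustrative instance of the stated width bounds; the choice of input dimension $n = 2$ and output dimension $m = 1$ is an assumption made only for this example and is not a case singled out in the text.
\begin{align*}
  \text{always sufficient:} \quad & 2n + 2m + 5 = 2\cdot 2 + 2\cdot 1 + 5 = 11,\\
  \text{sufficient for the rich subclass:} \quad & n + m + 3 = 2 + 1 + 3 = 6,\\
  \text{necessary in general:} \quad & \max\{2n,\, 2m\} = \max\{4,\, 2\} = 4.
\end{align*}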