This work addresses two fundamental limitations in neural network approximation theory. We show that a three-dimensional network architecture yields a significantly more efficient representation of sawtooth functions, which serve as a cornerstone in the approximation of analytic and $L^p$ functions. First, we establish substantially improved exponential approximation rates for several important classes of analytic functions and propose a parameter-efficient network design. Second, we derive, for the first time, quantitative and non-asymptotic high-order approximation results for general $L^p$ functions. Our techniques advance the theoretical understanding of neural network approximation on fundamental function spaces and offer a theoretically grounded path toward more parameter-efficient network designs.
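The role of sawtooth functions can be illustrated with the classical ReLU construction: composing a single "hat" (tent) map with itself $k$ times produces a sawtooth with $2^k$ linear pieces, so depth buys an exponential number of oscillations. The sketch below is a minimal illustration of this standard construction, not of the three-dimensional architecture proposed in this work; the function names are illustrative.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Tent map on [0, 1] written with two ReLU units:
    # hat(x) = 2*ReLU(x) - 4*ReLU(x - 0.5), which rises from 0 to 1
    # on [0, 0.5] and falls back to 0 on [0.5, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def sawtooth(x, k):
    # Composing the hat map k times yields a sawtooth with 2^(k-1)
    # teeth (2^k linear pieces) on [0, 1]: exponentially many
    # oscillations from only linearly many layers.
    y = x
    for _ in range(k):
        y = hat(y)
    return y
```

For example, `sawtooth(x, 2)` peaks at $x = 1/4$ and $x = 3/4$ and vanishes at $x \in \{0, 1/2, 1\}$, matching the four-piece sawtooth.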