In this short note, we give an elementary proof of a universal approximation theorem for neural networks with three hidden layers and an increasing, continuous, bounded activation function. The result is weaker than the best known results, but the proof is elementary in the sense that it uses no machinery beyond undergraduate analysis.