In this study, we establish that deep neural networks employing ReLU and ReLU$^2$ activation functions are capable of representing Lagrange finite element functions of any order on simplicial meshes across arbitrary dimensions. We introduce a novel global formulation of the basis functions for Lagrange elements, grounded in a geometric decomposition of these elements and leveraging two essential properties of high-dimensional simplicial meshes and barycentric coordinate functions. This representation theory facilitates a natural approximation result for such deep neural networks. Our findings present the first demonstration of how deep neural networks can systematically generate general continuous piecewise polynomial functions.
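As a minimal one-dimensional illustration of the ReLU half of this claim (a standard observation, not the paper's general high-dimensional construction): the linear Lagrange "hat" basis function on a uniform 1D mesh is exactly a combination of three ReLU units. The node locations `x0 < x1 < x2` below are illustrative.

```python
import numpy as np

def relu(x):
    """ReLU activation, max(x, 0)."""
    return np.maximum(x, 0.0)

def relu_hat(x, x0, x1, x2):
    """Exact ReLU representation of the 1D piecewise-linear Lagrange
    hat function on uniformly spaced nodes x0 < x1 < x2 (h = x1 - x0):
        hat(x) = (ReLU(x - x0) - 2*ReLU(x - x1) + ReLU(x - x2)) / h
    It equals 1 at x1, 0 at x0 and x2, and is linear in between.
    """
    h = x1 - x0
    return (relu(x - x0) - 2.0 * relu(x - x1) + relu(x - x2)) / h

# Sample the hat function centered at 0.5 on nodes 0.0, 0.5, 1.0.
xs = np.linspace(-0.5, 1.5, 9)
vals = relu_hat(xs, 0.0, 0.5, 1.0)
```

A single hidden ReLU layer thus reproduces every continuous piecewise-linear function of one variable; the paper's contribution is the systematic extension of such representations to arbitrary-order Lagrange elements on simplicial meshes in any dimension, where ReLU$^2$ units supply the higher-degree polynomial pieces.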