In this paper, we focus on fully connected deep neural networks with the Rectified Linear Unit (ReLU) activation function for nonparametric estimation. We derive non-asymptotic bounds that yield convergence rates while accounting for both temporal and spatial dependence in the observed measurements. By modeling dependencies across time and space, our framework better reflects the complexities of real-world data, enhancing both predictive performance and theoretical robustness. We also mitigate the curse of dimensionality by modeling the data on a manifold, exploiting the intrinsic dimensionality of high-dimensional data. We broaden existing theoretical results on temporal-spatial analysis by applying them to neural networks in more general settings, and we show that our proof techniques extend to models with short-range dependence. Our empirical simulations across a variety of synthetic response functions demonstrate the superior performance of our method, which outperforms established approaches in the existing literature. These findings provide valuable insights into the strong capabilities of dense neural networks for temporal-spatial modeling across a broad range of function classes.