In this paper, we focus on fully connected deep neural networks with the Rectified Linear Unit (ReLU) activation function for nonparametric estimation. We derive non-asymptotic bounds that yield convergence rates, addressing both temporal and spatial dependence in the observed measurements. By accounting for dependence across time and space, our models better reflect the complexities of real-world data, enhancing both predictive performance and theoretical robustness. We also tackle the curse of dimensionality by modeling the data on a manifold, exploiting the intrinsic low dimensionality of high-dimensional data. We broaden existing theoretical findings in temporal-spatial analysis by applying them to neural networks in more general contexts and show that our proof techniques extend to models with short-range dependence. Our empirical simulations across various synthetic response functions demonstrate that our method outperforms established approaches in the existing literature. These findings provide valuable insights into the strong capabilities of dense neural networks for temporal-spatial modeling across a broad range of function classes.