In this paper we prove that rectified deep neural networks do not suffer from the curse of dimensionality when approximating solutions of McKean--Vlasov stochastic differential equations (SDEs), in the sense that the number of parameters of the approximating deep neural networks grows at most polynomially in both the space dimension $d$ of the SDE and the reciprocal $1/\epsilon$ of the prescribed approximation accuracy $\epsilon$.