We prove that multilevel Picard approximations and deep neural networks with ReLU, leaky ReLU, and softplus activation are capable of approximating solutions of semilinear Kolmogorov PDEs in the $L^\mathfrak{p}$-sense, $\mathfrak{p}\in [2,\infty)$, in the case of gradient-independent, Lipschitz-continuous nonlinearities, while the computational effort of the multilevel Picard approximations and the required number of parameters in the neural networks grow at most polynomially in both the dimension $d\in \mathbb{N}$ and the reciprocal $1/\epsilon$ of the prescribed accuracy $\epsilon$.
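For concreteness, a semilinear Kolmogorov PDE with gradient-independent nonlinearity can be written in the following standard form; the notation ($\mu$, $\sigma$, $f$, $g$) is illustrative and not taken from the abstract itself:

$$
\frac{\partial u}{\partial t}(t,x)
= \tfrac{1}{2}\operatorname{Trace}\!\big(\sigma(x)\sigma(x)^{*}(\operatorname{Hess}_x u)(t,x)\big)
+ \big\langle \mu(x), (\nabla_x u)(t,x) \big\rangle
+ f\big(u(t,x)\big),
\qquad u(0,x) = g(x),
$$

for $(t,x)\in [0,T]\times\mathbb{R}^d$, where the nonlinearity $f$ depends on $u$ but not on $\nabla_x u$ (gradient-independence) and is assumed Lipschitz continuous.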