We introduce forward-backward stochastic differential equations, highlighting the connection, established by the Feynman-Kac theorem, between their solutions and solutions of partial differential equations. We review the technique of approximating solutions to high-dimensional partial differential equations using neural networks, and similarly of approximating solutions of stochastic differential equations using multilevel Monte Carlo. Connecting the multilevel Monte Carlo method with the neural network framework in the setup established by E et al. and Raissi, we replicate many of their empirical results and provide novel numerical analyses yielding strong error bounds for Raissi's specific framework. Our results bound the overall strong error by the maximum of the discretisation error and the neural network's approximation error. These analyses are pivotal for applications of multilevel Monte Carlo, for which we propose suitable frameworks that exploit the variance structures of the multilevel estimators we elucidate. Focusing on the loss function advocated by Raissi, we also expose its limitations, highlighting and quantifying its bias and variance. Lastly, we propose several avenues of further research which we anticipate should offer significant insight and speed improvements.