This work is devoted to the numerical approximation of high-dimensional advection-diffusion equations. It is well known that classical methods, such as the finite volume method, suffer from the curse of dimensionality, and that their time step is constrained by a stability condition. The semi-Lagrangian method is known to overcome the stability issue, while recent time-discrete neural network-based approaches overcome the curse of dimensionality. In this work, we propose a novel neural semi-Lagrangian method that combines these two approaches. It relies on projecting the initial condition onto a finite-dimensional neural space, and then solving an optimization problem, involving the backward characteristic equation, at each time step. It is particularly well-suited for implementation on GPUs, as it is fully parallelizable and does not require a mesh. We provide rough error estimates, present several high-dimensional numerical experiments to assess the performance of our approach, and compare it to other neural methods.
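To make the time-stepping concrete, the following is a minimal sketch (in JAX, not the authors' code) of one neural semi-Lagrangian step for the pure advection part, u_t + a(x) · ∇u = 0: the new network is fit, by least squares at random collocation points, to the old network evaluated at the feet of the backward characteristics. The MLP architecture, the explicit Euler characteristic solver, the plain gradient-descent optimizer, and the velocity field are all illustrative assumptions; the diffusion term and the paper's actual projection scheme are omitted.

```python
import jax
import jax.numpy as jnp

DIM, DT, N_PTS, LR, N_ITERS = 8, 1e-2, 4096, 1e-3, 500

def init_params(key, widths=(DIM, 64, 64, 1)):
    # Small MLP u_theta: R^DIM -> R representing the finite-dimensional neural space.
    params = []
    for d_in, d_out in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in),
                       jnp.zeros(d_out)))
    return params

def mlp(params, x):
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return (x @ w + b).squeeze(-1)

def velocity(x):
    # Hypothetical stand-in for the advection field a(x).
    return jnp.roll(x, shift=1, axis=-1)

def neural_sl_step(params_old, params_new, key):
    """Fit u_new(x) ~= u_old(x - dt * a(x)) by least squares (advection only)."""
    x = jax.random.uniform(key, (N_PTS, DIM), minval=-1.0, maxval=1.0)
    feet = x - DT * velocity(x)        # explicit Euler feet of backward characteristics
    target = mlp(params_old, feet)     # values transported along the characteristics

    def loss(p):
        return jnp.mean((mlp(p, x) - target) ** 2)

    grad_fn = jax.jit(jax.grad(loss))
    for _ in range(N_ITERS):           # plain gradient descent, for simplicity
        params_new = jax.tree_util.tree_map(
            lambda p, g: p - LR * g, params_new, grad_fn(params_new))
    return params_new
```

In practice one would warm-start `params_new` from `params_old` so each optimization is cheap; note that all collocation points are processed in parallel with no mesh, which is what makes the approach GPU-friendly.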