Parallel-in-time (PinT) techniques have been proposed to solve systems of time-dependent differential equations by parallelizing the temporal domain. Among them, Parareal computes the solution sequentially using an inaccurate (fast) solver, and then "corrects" it using an accurate (slow) integrator that runs in parallel across temporal subintervals. This work introduces RandNet-Parareal, a novel method that learns the discrepancy between the coarse and fine solutions using random neural networks (RandNets). RandNet-Parareal achieves speedups of up to 125x over the fine solver run serially and 22x over Parareal. Beyond the theoretical guarantees of RandNets as universal approximators, these models are quick to train, allowing the PinT solution of partial differential equations on spatial meshes of up to $10^5$ points with minimal overhead and dramatically increasing the scalability of existing PinT approaches. RandNet-Parareal's numerical performance is illustrated on systems of real-world significance, such as the viscous Burgers' equation, the Diffusion-Reaction equation, the two- and three-dimensional Brusselator, and the shallow water equation.
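To make the predictor-corrector structure described above concrete, the following is a minimal sketch of the classic Parareal iteration (not the authors' RandNet-Parareal code) on a scalar test ODE $dy/dt = -y$. All function names and parameter choices here are illustrative assumptions: the coarse solver $G$ is one explicit Euler step per time slice, and the fine solver $F$ takes many Euler substeps; in practice the fine solves are the expensive part and run in parallel across slices.

```python
import numpy as np

def f(y):
    # Right-hand side of the illustrative test ODE dy/dt = -y.
    return -y

def euler(y, t0, t1, n_steps):
    # Explicit Euler integration of f from t0 to t1 in n_steps steps.
    dt = (t1 - t0) / n_steps
    for _ in range(n_steps):
        y = y + dt * f(y)
    return y

def parareal(y0, t_grid, n_iter, coarse_steps=1, fine_steps=100):
    N = len(t_grid) - 1
    U = np.zeros(N + 1)
    U[0] = y0
    # Initial serial sweep with the cheap coarse solver G.
    for i in range(N):
        U[i + 1] = euler(U[i], t_grid[i], t_grid[i + 1], coarse_steps)
    for _ in range(n_iter):
        # Fine solves F(U_i^k) are independent across slices,
        # so this loop is the part that parallelizes.
        F_vals = [euler(U[i], t_grid[i], t_grid[i + 1], fine_steps)
                  for i in range(N)]
        G_old = [euler(U[i], t_grid[i], t_grid[i + 1], coarse_steps)
                 for i in range(N)]
        U_new = np.zeros(N + 1)
        U_new[0] = y0
        for i in range(N):
            # Serial coarse propagation of the updated state.
            G_new = euler(U_new[i], t_grid[i], t_grid[i + 1], coarse_steps)
            # Parareal update: U_{i+1}^{k+1} = G(U_i^{k+1}) + F(U_i^k) - G(U_i^k).
            # RandNet-Parareal replaces the F - G discrepancy with a
            # learned model; here it is computed exactly.
            U_new[i + 1] = G_new + F_vals[i] - G_old[i]
        U = U_new
    return U

t_grid = np.linspace(0.0, 1.0, 11)  # 10 time slices
sol = parareal(1.0, t_grid, n_iter=5)
err = abs(sol[-1] - np.exp(-1.0))  # compare against the exact solution e^{-1}
print(err)
```

After a few iterations the Parareal iterate matches the fine solution on the early slices exactly, which is why the error at the final time shrinks rapidly even though only the coarse solver is ever run serially.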