We develop and evaluate a method for learning solution operators of nonlinear problems governed by partial differential equations. The approach is based on a finite element discretization and represents the solution operator by a multilayer perceptron (MLP) that takes latent variables as input. The latent variables typically correspond to parameters in a parametrization of input data such as boundary conditions, coefficients, and right-hand sides. The loss function is typically an energy functional, and we formulate efficient, parallelizable training algorithms based on assembling the energy locally on each element. For large problems, the learning process can be made more efficient by using only a small fraction of randomly chosen elements of the mesh in each iteration. The approach is evaluated on several relevant test cases, where learning the solution operator turns out to be beneficial compared to classical numerical methods.
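The element-local energy assembly and the random element subsampling described above can be illustrated on a minimal 1D model problem. The sketch below is an assumption-laden simplification, not the paper's method: it solves -u'' = f on [0, 1] with homogeneous Dirichlet conditions and f = 1, replaces the MLP by direct nodal unknowns, and minimizes the discrete energy by (stochastic) gradient descent. All function names (`element_energy`, `energy_grad`, `train`) are hypothetical; the unbiased rescaling by `n_elem / batch` mirrors the idea of training on a random subset of elements per iteration.

```python
import numpy as np

def element_energy(u, e, h, f=1.0):
    # Energy of element e for -u'' = f: 0.5*(u')^2 - f*u,
    # linear elements, trapezoidal quadrature for the load term.
    du = (u[e + 1] - u[e]) / h
    return 0.5 * du**2 * h - f * h * (u[e] + u[e + 1]) / 2.0

def energy_grad(u, elems, h, f=1.0):
    # Gradient of the energy assembled only over the given elements.
    g = np.zeros_like(u)
    for e in elems:
        du = (u[e + 1] - u[e]) / h
        g[e]     += -du - f * h / 2.0
        g[e + 1] +=  du - f * h / 2.0
    return g

def train(n_elem=16, iters=5000, lr=0.01, batch=None, seed=0):
    rng = np.random.default_rng(seed)
    h = 1.0 / n_elem
    u = np.zeros(n_elem + 1)  # nodal values stand in for the MLP output
    for _ in range(iters):
        if batch is None:
            elems = np.arange(n_elem)            # full assembly
            scale = 1.0
        else:
            elems = rng.choice(n_elem, batch, replace=False)
            scale = n_elem / batch               # unbiased rescaling
        g = scale * energy_grad(u, elems, h)
        g[0] = g[-1] = 0.0                       # enforce u(0) = u(1) = 0
        u -= lr * g
    return u

u = train()                       # full-batch energy minimization
x = np.linspace(0.0, 1.0, len(u))
exact = x * (1.0 - x) / 2.0       # exact solution of -u'' = 1
err = np.abs(u - exact).max()
```

For this problem the linear finite element solution is nodally exact, so full-batch descent drives `err` to machine precision; passing e.g. `batch=4` trains on a random quarter of the elements per iteration at the cost of stochastic noise in the iterates.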