Physics-informed neural networks (PINNs) offer several advantages over traditional numerical methods for solving PDEs, such as being mesh-free and being easily extendable to inverse problems. One promising approach for allowing PINNs to scale to multi-scale problems is to combine them with domain decomposition; for example, finite basis physics-informed neural networks (FBPINNs) replace the global PINN network with many localised networks which are summed together to approximate the solution. In this work, we significantly accelerate the training of FBPINNs by linearising their underlying optimisation problem. We achieve this by employing extreme learning machines (ELMs) as their subdomain networks and showing that this turns the FBPINN optimisation problem into one of solving a linear system or least-squares problem. We test our workflow in a preliminary fashion by using it to solve an illustrative 1D problem.
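The core idea, that fixing an ELM's hidden-layer weights makes the PDE residual linear in the trainable output weights, can be illustrated with a minimal sketch. This is not the authors' FBPINN code: it uses a single global ELM with no domain decomposition, and the toy problem (u'(x) = cos(x) on [0, 2&#960;] with u(0) = 0, exact solution u = sin(x)), the network width, and the weight-sampling ranges are all illustrative assumptions. Because the random hidden weights are frozen, collocating the residual reduces training to one least-squares solve:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D problem (an assumption for illustration, not the paper's test case):
#   u'(x) = cos(x) on [0, 2*pi],  u(0) = 0,  exact solution u = sin(x).
# ELM ansatz: u(x) = sum_j c_j * tanh(w_j * x + b_j), where w_j, b_j are drawn
# at random and FROZEN, so u (and hence the PDE residual) is linear in c.
n_hidden = 100
w = rng.uniform(-2.0, 2.0, n_hidden)
b = -w * rng.uniform(0.0, 2 * np.pi, n_hidden)  # spread tanh centres over the domain

x = np.linspace(0.0, 2 * np.pi, 200)        # collocation points
phi = np.tanh(np.outer(x, w) + b)           # basis matrix, shape (200, n_hidden)
dphi = w * (1.0 - phi**2)                   # d/dx tanh(wx+b) = w * sech^2(wx+b)

# Stack PDE residual rows and one boundary-condition row into A c = f,
# then "train" by solving the linear least-squares problem.
A = np.vstack([dphi, np.tanh(w * 0.0 + b)[None, :]])
f = np.concatenate([np.cos(x), [0.0]])
c, *_ = np.linalg.lstsq(A, f, rcond=None)

u = phi @ c                                 # approximate solution at x
err = np.max(np.abs(u - np.sin(x)))         # compare against the exact solution
```

In an FBPINN, each subdomain would carry its own small ELM, with the window-weighted sum of subdomain outputs remaining linear in the concatenated output weights, so the same least-squares structure carries over.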