Advancements in finite element methods have become essential across many disciplines, particularly in Computational Fluid Dynamics (CFD), driving research efforts toward improved precision and efficiency. While Convolutional Neural Networks (CNNs) have found success in CFD by mapping meshes onto images, recent attention has turned to Graph Neural Networks (GNNs), which process meshes directly. This paper introduces a novel model that merges Self-Attention with Message Passing in GNNs, achieving a 15\% reduction in RMSE on the well-known flow-past-a-cylinder benchmark. Furthermore, a dynamic mesh pruning technique based on Self-Attention is proposed, leading to a robust GNN-based multigrid approach that also reduces RMSE by 15\%. Additionally, a new self-supervised training method based on BERT is presented, yielding a 25\% RMSE reduction. The paper includes an ablation study and demonstrates results that outperform state-of-the-art models on several challenging datasets, promising advancements similar to those recently achieved in natural language and image processing. Finally, the paper introduces a dataset with meshes at least an order of magnitude larger than those in existing datasets. Code and datasets will be released at https://github.com/DonsetPG/multigrid-gnn.
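The combination of self-attention with message passing mentioned above can be illustrated with a generic sketch: each node attends over its incoming neighbors with scaled dot-product attention, and the attention weights replace the uniform averaging of plain message passing. This is a minimal NumPy illustration of the general idea, not the paper's actual architecture; the projection matrices `Wq`, `Wk`, `Wv` and the single-head, unbatched formulation are simplifying assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_message_passing(h, edges, Wq, Wk, Wv):
    """One attention-weighted message-passing layer (illustrative sketch).

    h:     (N, d) node feature matrix.
    edges: list of (src, dst) directed edges; messages flow src -> dst.
    Wq, Wk, Wv: (d, d) projections (learned parameters in a real model).
    """
    N, d = h.shape
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    out = np.zeros_like(h)
    for dst in range(N):
        nbrs = [src for src, d_ in edges if d_ == dst]
        if not nbrs:
            # No incoming messages: keep the node's own features.
            out[dst] = h[dst]
            continue
        # Scaled dot-product attention scores over the neighborhood of dst.
        scores = np.array([q[dst] @ k[src] for src in nbrs]) / np.sqrt(d)
        alpha = softmax(scores)
        # Aggregate neighbor messages, weighted by attention instead of
        # the uniform mean used in vanilla message passing.
        out[dst] = sum(a * v[src] for a, src in zip(alpha, nbrs))
    return out
```

In a real mesh-based GNN the edges would come from the mesh connectivity, and the layer would be stacked and trained end to end; the per-neighbor attention weights are also what a pruning scheme could inspect to decide which nodes to coarsen away.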