We introduce a Graph Transformer framework that serves as a general inverse physics engine on meshes, demonstrated through the challenging task of reconstructing aerodynamic flow fields from sparse surface measurements. While deep learning has shown promising results in forward physics simulation, inverse problems remain particularly challenging due to their ill-posed nature and the difficulty of propagating information from limited boundary observations. Our approach addresses these challenges by combining the geometric expressiveness of message-passing neural networks with the global reasoning of Transformers, enabling efficient learning of inverse mappings from boundary conditions to complete states. We evaluate this framework on a comprehensive dataset of steady-state RANS simulations around diverse airfoil geometries, where the task is to reconstruct full pressure and velocity fields from surface pressure measurements alone. The architecture achieves high reconstruction accuracy while maintaining fast inference times. Our experiments provide insights into the relative importance of local geometric processing and global attention mechanisms in mesh-based inverse problems, and show that the framework remains robust under reduced sensor coverage. These results suggest that Graph Transformers can serve as effective inverse physics engines across a broader range of applications where complete system states must be reconstructed from limited boundary observations.
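The core architectural idea, interleaving local message passing over mesh edges with global self-attention across all nodes, can be sketched as follows. This is a minimal illustrative layer, not the paper's exact architecture: the layer structure, residual updates, single-head attention, and all weight names (`W_msg`, `W_q`, `W_k`, `W_v`) are assumptions for exposition.

```python
import numpy as np

def graph_transformer_layer(x, edges, W_msg, W_q, W_k, W_v):
    """One illustrative Graph Transformer layer (a sketch, assuming a
    simple sum-aggregation message-passing step followed by a single
    head of global scaled dot-product attention)."""
    n, d = x.shape
    # Local step: each node sums linearly transformed neighbor features,
    # capturing mesh geometry through the edge connectivity.
    agg = np.zeros_like(x)
    for src, dst in edges:
        agg[dst] += x[src] @ W_msg
    h = x + agg  # residual update after message passing
    # Global step: self-attention over all mesh nodes lets information
    # from sparse boundary observations reach the whole field.
    q, k, v = h @ W_q, h @ W_k, h @ W_v
    scores = q @ k.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return h + weights @ v  # residual update after attention

# Toy example: a 4-node line-graph "mesh" with bidirectional edges.
rng = np.random.default_rng(0)
n, d = 4, 8
x = rng.normal(size=(n, d))
edges = [(0, 1), (1, 0), (1, 2), (2, 1), (2, 3), (3, 2)]
Ws = [rng.normal(size=(d, d)) * 0.1 for _ in range(4)]
out = graph_transformer_layer(x, edges, *Ws)
print(out.shape)  # (4, 8): one updated feature vector per mesh node
```

In a full model, stacking such layers lets local geometric processing and global attention complement each other, which is the trade-off the experiments in the paper probe.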