While Transformers have demonstrated remarkable potential in modeling Partial Differential Equations (PDEs), modeling large-scale unstructured meshes with complex geometries remains a significant challenge. Existing efficient architectures often employ feature dimensionality reduction strategies, which inadvertently induce Geometric Aliasing and thereby lose critical physical boundary information. To address this, we propose the Physics-Geometry Operator Transformer (PGOT), designed to reconstruct physical feature learning through explicit geometry awareness. Specifically, we propose Spectrum-Preserving Geometric Attention (SpecGeo-Attention). Through a ``physics slicing-geometry injection'' mechanism, this module incorporates multi-scale geometric encodings to explicitly preserve geometric features while maintaining linear computational complexity $O(N)$. Furthermore, based on spatial coordinates, PGOT dynamically routes computation to low-order linear paths in smooth regions and high-order non-linear paths around shock waves and discontinuities, enabling spatially adaptive, high-precision physical field modeling. PGOT achieves consistent state-of-the-art performance across four standard benchmarks and excels in large-scale industrial tasks, including airfoil and automobile design.
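To make the linear-complexity claim concrete, the following is a minimal, hypothetical sketch of a slice-based attention with geometry injection: points are softly pooled into a fixed number of slice tokens, attention runs only among those tokens, and the result is broadcast back to the points. All names, layer choices, and the exact injection scheme are illustrative assumptions, not PGOT's actual implementation.

```python
import torch
import torch.nn as nn

class SliceGeoAttention(nn.Module):
    """Illustrative sketch (NOT the paper's code): N mesh points are softly
    assigned to M << N slices; attention runs among the M slice tokens
    (cost independent of N), and the result is broadcast back to the
    points, giving O(N) overall cost. Geometry is injected by adding a
    coordinate-based bias to the slice-assignment logits."""

    def __init__(self, dim: int, num_slices: int = 32):
        super().__init__()
        self.assign = nn.Linear(dim, num_slices)   # point features -> slice logits
        self.geo_enc = nn.Linear(3, num_slices)    # xyz coordinates -> geometric bias
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.proj = nn.Linear(dim, dim)

    def forward(self, feats: torch.Tensor, coords: torch.Tensor) -> torch.Tensor:
        # feats: (B, N, dim), coords: (B, N, 3)
        logits = self.assign(feats) + self.geo_enc(coords)  # geometry injection
        w = logits.softmax(dim=-1)                          # (B, N, M) soft assignment
        slices = torch.einsum("bnm,bnd->bmd", w, feats)     # pool N points -> M tokens
        slices, _ = self.attn(slices, slices, slices)       # O(M^2) attention, M fixed
        out = torch.einsum("bnm,bmd->bnd", w, slices)       # broadcast back to N points
        return self.proj(out)

# toy usage on an unstructured point set
x = torch.randn(2, 1000, 64)   # 1000 mesh points, 64-dim features
pos = torch.rand(2, 1000, 3)   # 3-D coordinates
y = SliceGeoAttention(64)(x, pos)
print(y.shape)  # torch.Size([2, 1000, 64])
```

Because the number of slices is a constant, doubling the point count doubles only the pooling and broadcast cost, which is the sense in which such schemes scale as $O(N)$.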