The recently proposed Anchored-Branched Universal Physics Transformers (AB-UPT) show strong capabilities in replicating automotive computational fluid dynamics simulations while requiring orders of magnitude less compute than traditional numerical solvers. In this technical report, we add two new datasets to the body of empirically evaluated use-cases of AB-UPT, combining high-quality data generation with state-of-the-art neural surrogates. Both datasets were generated with the Luminary Cloud platform and contain automobiles (SHIFT-SUV) and aircraft (SHIFT-Wing). We start by detailing the data generation. Next, we show favorable performance of AB-UPT against previous state-of-the-art transformer-based baselines on both datasets, followed by extensive qualitative and quantitative evaluations of our best AB-UPT model. AB-UPT shows strong performance across the board. Notably, it obtains near-perfect prediction of integrated aerodynamic forces within seconds from a simple isotropically tessellated geometry representation and is trainable within a day on a single GPU, paving the way for industry-scale applications.