We present a generative reduced basis (RB) approach to construct reduced order models for parametrized partial differential equations. Central to this approach is the construction of generative RB spaces that provide rapidly convergent approximations of the solution manifold. We introduce a generative snapshot method to generate significantly larger sets of snapshots from a small initial set of solution snapshots. This method leverages multivariate nonlinear transformations to enrich the RB spaces, allowing for a more accurate approximation of the solution manifold than commonly used techniques such as proper orthogonal decomposition and greedy sampling. The key components of our approach include (i) a Galerkin projection of the full order model onto the generative RB space to form the reduced order model; (ii) a posteriori error estimates to certify the accuracy of the reduced order model; and (iii) an offline-online decomposition to separate the computationally intensive model construction, performed once during the offline stage, from the real-time model evaluations performed many times during the online stage. The error estimates allow us to efficiently explore the parameter space and select parameter points that maximize the accuracy of the reduced order model. Through numerical experiments, we demonstrate that the generative RB method not only improves the accuracy of the reduced order model but also provides tight error estimates.
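The enrichment-then-projection pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the specific multivariate nonlinear transformations are assumptions (pairwise products and a cubic map stand in for whatever the method actually uses), the snapshots are random placeholders, and the basis is extracted by POD via a truncated SVD.

```python
import numpy as np

# Hypothetical sketch of a generative snapshot method: enrich a small
# snapshot set with multivariate nonlinear transformations, then build
# a reduced basis (RB) space by POD (truncated SVD) of the enriched set.

rng = np.random.default_rng(0)
N, m = 200, 5                      # full-order dimension, number of initial snapshots
S = rng.standard_normal((N, m))    # placeholder snapshots u(mu_1), ..., u(mu_m)

# Generative enrichment: append nonlinear combinations of the snapshots.
# (Pairwise products and a cubic map are illustrative choices only.)
pairs = [S[:, i] * S[:, j] for i in range(m) for j in range(i, m)]
G = np.column_stack([S, *pairs, S**3])   # enriched set, much larger than m

# POD: retain the modes that capture nearly all of the snapshot energy.
U, sigma, _ = np.linalg.svd(G, full_matrices=False)
energy = np.cumsum(sigma**2) / np.sum(sigma**2)
n = int(np.searchsorted(energy, 1.0 - 1e-10) + 1)
V = U[:, :n]                       # generative RB basis with orthonormal columns

print(f"initial snapshots: {m}, enriched snapshots: {G.shape[1]}, RB dimension: {n}")
```

A Galerkin projection of the full-order operators onto `V` would then yield the reduced-order model, with the offline stage doing the enrichment and SVD once and the online stage solving only small `n`-dimensional systems.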