Trained neural networks (NN) have attractive features for closing governing equations. Many methods show promise, but all can fail in cases where small errors compound to violate physical reality, such as a solution boundedness condition. A NN formulation is introduced to preclude spurious oscillations that violate solution boundedness or positivity. It is embedded in the discretized equations as a machine learning closure and strictly constrained, inspired by total variation diminishing (TVD) methods for hyperbolic conservation laws. The constraint is enforced exactly during gradient-descent training by rescaling the NN parameters, which maps them onto an explicit feasible set. Demonstrations show that the constrained NN closure model usefully recovers linear and nonlinear hyperbolic phenomena and anti-diffusion while enforcing the non-oscillatory property. Finally, the model is applied to subgrid-scale (SGS) modeling of a turbulent reacting flow, for which it suppresses spurious oscillations in scalar fields that would otherwise violate solution boundedness. It outperforms a simple penalization of oscillations in the loss function.
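The core mechanism above — enforcing a constraint exactly during gradient descent by rescaling the parameters onto an explicit feasible set, rather than penalizing violations in the loss — can be sketched in a few lines. This is a minimal illustration, not the paper's actual TVD-inspired formulation: the model is a plain linear map, and the feasible set is assumed to be an L1 ball `||w||_1 <= C` as a stand-in for the boundedness-preserving constraint.

```python
import numpy as np

rng = np.random.default_rng(0)

C = 1.0                                   # assumed bound defining the feasible set
X = rng.normal(size=(64, 4))              # synthetic inputs
y = X @ np.array([2.0, -1.0, 0.5, 0.0])   # synthetic targets

w = rng.normal(size=4)                    # "NN" parameters (linear model here)
lr = 0.05
for _ in range(200):
    grad = 2.0 * X.T @ (X @ w - y) / len(y)  # MSE gradient
    w -= lr * grad                           # unconstrained gradient step
    norm = np.abs(w).sum()
    if norm > C:
        w *= C / norm                        # rescale: map w back onto the feasible set

print(np.abs(w).sum())  # stays at or below C throughout training
```

Because the rescaling is applied after every step, the constraint holds exactly at all times, in contrast to a soft penalty term, which only discourages violations on average.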