We introduce estimatable variation neural networks (EVNNs), a class of neural networks that admit a computationally cheap estimate of the $BV$ norm, motivated by the space $BMV$ of functions of bounded M-variation. We prove a universal approximation theorem for EVNNs and discuss possible implementations. We construct sequences of loss functionals for ODEs and scalar hyperbolic conservation laws for which a vanishing loss implies convergence. Moreover, we show the existence of sequences of loss-minimizing neural networks whenever the solution is an element of $BMV$. Several numerical test cases illustrate that these loss functionals can be minimized for EVNNs using standard optimization techniques.