The least-squares ReLU neural network (LSNN) method was introduced and studied for solving the linear advection-reaction equation with discontinuous solution in \cite{Cai2021linear,cai2023least}. The method is based on an equivalent least-squares formulation and employs ReLU neural network (NN) functions with $\lceil \log_2(d+1)\rceil+1$-layer representations for approximating solutions. In this paper, we show theoretically that the method is also capable of approximating non-constant jumps along discontinuous interfaces that are not necessarily straight lines. Numerical results for test problems with various non-constant jumps and interfaces show that the LSNN method with $\lceil \log_2(d+1)\rceil+1$ layers approximates solutions accurately with fewer degrees of freedom than mesh-based methods and without the common Gibbs phenomenon along discontinuous interfaces.