We use elliptic partial differential equations (PDEs) as examples to illustrate various properties and behaviors that arise when shallow neural networks (SNNs) are used to represent their solutions. In particular, we study the numerical ill-conditioning, the frequency bias, and the balance between the differential operator and the shallow network representation, for different formulations of the PDEs and with various activation functions. Our study shows that the performance of Physics-Informed Neural Networks (PINNs) or the Deep Ritz Method (DRM) using linear SNNs with power-ReLU activation is dominated by their inherent ill-conditioning and spectral bias against high frequencies. Although this can be alleviated by using non-homogeneous activation functions with proper scaling, achieving such adaptivity for nonlinear SNNs remains costly due to ill-conditioning.
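As a minimal numerical sketch of the ill-conditioning claim above (this is an illustrative setup of our own, not the paper's exact experiment), consider a linear SNN with fixed inner parameters and power-ReLU activation, i.e. basis functions phi_i(x) = ReLU(x - b_i)^2 on [0, 1]. The Gram matrix of this basis becomes severely ill-conditioned as the number of neurons grows, since adjacent basis functions are nearly parallel:

```python
import numpy as np

# Hypothetical setup: linear shallow network with power-ReLU activation
#   phi_i(x) = ReLU(x - b_i)^2  on [0, 1],
# with fixed, uniformly spaced inner biases b_i. We estimate the
# condition number of the Gram (mass) matrix via a Riemann sum.

n = 20                                         # number of neurons (assumed)
m = 400                                        # quadrature points (assumed)
x = np.linspace(0.0, 1.0, m)
b = np.linspace(0.0, 1.0, n, endpoint=False)   # fixed inner biases

# Feature matrix: Phi[j, i] = ReLU(x_j - b_i)^2
Phi = np.maximum(x[:, None] - b[None, :], 0.0) ** 2

# Gram matrix approximated by a Riemann sum over the grid
G = Phi.T @ Phi / m

cond = np.linalg.cond(G)
print(f"condition number of the Gram matrix: {cond:.3e}")
```

Even at this modest width, the condition number is many orders of magnitude above that of a well-conditioned basis, which is consistent with the abstract's observation that optimization of such representations is dominated by ill-conditioning.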