We propose a new symplectic convolutional neural network (CNN) architecture that combines symplectic neural networks, proper symplectic decomposition, and tensor techniques. Specifically, we first introduce a mathematically equivalent form of the convolution layer and then, using symplectic neural networks, show how to parameterize the CNN layers so that each convolution layer remains symplectic. To obtain a complete autoencoder, we further introduce a symplectic pooling layer. We evaluate the proposed network on three examples: the wave equation, the nonlinear Schrödinger (NLS) equation, and the sine-Gordon equation. The numerical results indicate that the symplectic CNN outperforms the linear symplectic autoencoder obtained via proper symplectic decomposition.
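To make the notion of a symplectic layer concrete, the sketch below implements a SympNet-style "gradient module" and numerically checks that its Jacobian J satisfies the symplecticity condition JᵀΩJ = Ω. This is an illustrative example of the kind of parameterization such networks use, not the paper's exact convolutional construction; the names `gradient_module` and `check_symplectic` are ours.

```python
import numpy as np

def gradient_module(p, q, K, a, b):
    """SympNet-style gradient module acting on (p, q).

    Updates p by the gradient of a scalar potential of q,
    p_hat = p + K^T (a * tanh(K q + b)),  q_hat = q,
    which is symplectic because dp_hat/dq is symmetric.
    (Illustrative parameterization, not the paper's exact layer.)
    """
    return p + K.T @ (a * np.tanh(K @ q + b)), q

def check_symplectic(f, n, eps=1e-5):
    """Estimate max |J^T Omega J - Omega| for the map z = (p, q) -> f(p, q)
    via central finite differences, where Omega is the canonical
    symplectic form on R^{2n}."""
    rng = np.random.default_rng(0)
    z0 = rng.standard_normal(2 * n)
    J = np.zeros((2 * n, 2 * n))
    for i in range(2 * n):
        dz = np.zeros(2 * n)
        dz[i] = eps
        zp, zm = z0 + dz, z0 - dz
        fp = np.concatenate(f(zp[:n], zp[n:]))
        fm = np.concatenate(f(zm[:n], zm[n:]))
        J[:, i] = (fp - fm) / (2 * eps)
    Omega = np.block([[np.zeros((n, n)), np.eye(n)],
                      [-np.eye(n), np.zeros((n, n))]])
    return np.max(np.abs(J.T @ Omega @ J - Omega))
```

A quick check with random weights confirms the deviation from Ω is at finite-difference noise level, as expected for an exactly symplectic map.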