PDE-based Group Convolutional Neural Networks (PDE-G-CNNs) replace the conventional components of G-CNNs with solvers of geometrically meaningful evolution PDEs. PDE-G-CNNs offer several key benefits simultaneously: fewer parameters, inherent equivariance, better performance, data efficiency, and geometric interpretability. In this article we focus on Euclidean-equivariant PDE-G-CNNs in which the feature maps are two-dimensional throughout; we call this variant of the framework a PDE-CNN. We list several practically desirable axioms and derive from them which PDEs should be used in a PDE-CNN. Our approach to geometric learning via PDEs is inspired by the axioms of classical linear and morphological scale-space theory, which we generalize by introducing semifield-valued signals. Furthermore, we experimentally confirm for small networks that PDE-CNNs offer fewer parameters, better performance, and greater data efficiency in comparison to CNNs. We also investigate the effect that the use of different semifields has on the performance of the models.