Group equivariance is a strong inductive bias useful in a wide range of deep learning tasks. However, constructing efficient equivariant networks for general groups and domains is difficult. Recent work by Finzi et al. (2021) directly solves the equivariance constraint for arbitrary matrix groups to obtain equivariant MLPs (EMLPs), but this method scales poorly, and scalability is crucial in deep learning. Here, we introduce Group Representation Networks (G-RepsNets), lightweight equivariant networks for arbitrary matrix groups whose features are represented using tensor polynomials. The key intuition behind our design is that using tensor representations in the hidden layers of a neural network, together with simple, inexpensive tensor operations, can yield expressive universal equivariant networks. We find G-RepsNet to be competitive with EMLP on several tasks with group symmetries such as O(5), O(1, 3), and O(3), with scalars, vectors, and second-order tensors as data types. On image classification tasks, G-RepsNet with second-order representations is competitive with, and often even outperforms, sophisticated state-of-the-art equivariant models such as GCNNs (Cohen & Welling, 2016a) and E(2)-CNNs (Weiler & Cesa, 2019). To further illustrate the generality of our approach, we show that G-RepsNet is competitive with EGNN (Satorras et al., 2021) and G-FNO (Helwig et al., 2023) on N-body prediction and PDE solving, respectively, while remaining efficient.