Permutation equivariant neural networks are typically used to learn from data that lives on a graph. However, for any graph $G$ that has $n$ vertices, using the symmetric group $S_n$ as its group of symmetries does not take into account the relations that exist between the vertices. Given that the actual group of symmetries is the automorphism group Aut$(G)$, we show how to construct neural networks that are equivariant to Aut$(G)$ by obtaining a full characterisation of the learnable, linear, Aut$(G)$-equivariant functions between layers that are some tensor power of $\mathbb{R}^{n}$. In particular, we find a spanning set of matrices for these layer functions in the standard basis of $\mathbb{R}^{n}$. This result has important consequences for learning from data whose group of symmetries is a finite group because a theorem by Frucht (1938) showed that any finite group is isomorphic to the automorphism group of a graph.
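To make the central notion concrete, the following is a minimal sketch (not code from the paper): it brute-forces Aut$(G)$ for a small path graph and checks whether a candidate linear layer $W$ commutes with the permutation matrix of every automorphism, which is exactly the Aut$(G)$-equivariance condition for maps $\mathbb{R}^{n} \to \mathbb{R}^{n}$. The graph, the matrix $W$, and all function names are illustrative choices, not objects defined in the paper.

```python
# Sketch: brute-force Aut(G) for the path graph P3 and test whether a
# candidate linear map W is Aut(G)-equivariant, i.e. P W = W P for the
# permutation matrix P of every automorphism. Illustrative only.
from itertools import permutations

n = 3
edges = {(0, 1), (1, 2)}                       # path graph P3: 0 - 1 - 2
sym_edges = edges | {(b, a) for a, b in edges}  # treat edges as undirected

def is_automorphism(p):
    # p sends vertex i to p[i]; it must map the edge set onto itself
    return {(p[a], p[b]) for a, b in sym_edges} == sym_edges

aut = [p for p in permutations(range(n)) if is_automorphism(p)]
# Aut(P3) has 2 elements: the identity and the endpoint swap (0 2)

def perm_matrix(p):
    # P e_i = e_{p(i)}, i.e. P[i][j] = 1 iff p(j) = i
    return [[1 if p[j] == i else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def equivariant(W):
    # W is Aut(G)-equivariant iff it commutes with every automorphism's P
    return all(matmul(perm_matrix(p), W) == matmul(W, perm_matrix(p))
               for p in aut)

# diag(1, 2, 1) commutes with the endpoint swap but not with all of S_3,
# so it is Aut(P3)-equivariant without being S_3-equivariant. This is the
# gap the abstract describes: Aut(G) admits more learnable weights than S_n.
W = [[1, 0, 0], [0, 2, 0], [0, 0, 1]]
print(len(aut), equivariant(W))  # → 2 True
```

Note that for $S_n$-equivariance the only such maps are spanned by the identity and the all-ones matrix, whereas here the smaller group Aut$(P_3)$ permits extra weight-sharing patterns such as the diagonal map above.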