Group equivariant neural networks are growing in importance because they generalise well in applications where the data has known underlying symmetries. Recent characterisations of a class of these networks, which use high-order tensor power spaces as their layers, suggest that they have significant potential; however, implementing them remains challenging because the computations involved are prohibitively expensive. In this work, we present a fast matrix multiplication algorithm for any equivariant weight matrix that maps between tensor power layer spaces in these networks, for four groups: the symmetric, orthogonal, special orthogonal, and symplectic groups. We obtain this algorithm by developing a diagrammatic framework based on category theory that not only allows us to express each weight matrix as a linear combination of diagrams but also lets us use these diagrams to factor the original computation into an optimal sequence of steps. We show that this algorithm achieves an exponential improvement in Big-$O$ time complexity over na\"{i}ve matrix multiplication.
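To give a flavour of the idea, the following is a minimal sketch (not the paper's actual algorithm) of why acting with a diagram basis element directly on a tensor can beat building the full equivariant weight matrix. The example diagram chosen here is the simple "swap" permutation on $(\mathbb{R}^n)^{\otimes 2}$: the naïve route materialises an $n^2 \times n^2$ matrix and performs an $O(n^4)$ matrix-vector product, whereas acting on the reshaped tensor directly costs $O(n^2)$.

```python
import numpy as np

# Hedged illustration: apply one diagram basis element of an equivariant
# map between tensor power spaces without forming the dense weight matrix.
# The "swap" diagram permutes the two tensor factors of (R^n)^{x2}.

n = 4
rng = np.random.default_rng(0)
v = rng.standard_normal(n * n)  # a vector in (R^n)^{x2}, flattened

# Naive route: build the n^2 x n^2 permutation matrix for the swap diagram
W = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        W[i * n + j, j * n + i] = 1.0
naive = W @ v  # O(n^4) matrix-vector product

# Diagrammatic route: act on the reshaped tensor directly, O(n^2)
fast = v.reshape(n, n).T.reshape(-1)

assert np.allclose(naive, fast)
```

The same principle, applied to the full diagram bases for the four groups above, is what drives the exponential improvement in time complexity.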