Equivariant machine learning is an approach for designing deep learning models that respect the symmetries of the problem, with the aim of reducing model complexity and improving generalization. In this paper, we focus on an extension of shift equivariance, which underlies convolutional networks on images, to general graphs. Unlike images, graphs do not have a natural notion of domain translation. Therefore, we take graph functional shifts as the symmetry group: the unitary operators that commute with the graph shift operator. Notably, such symmetries operate in the signal space rather than directly in the spatial domain. We remark that each linear filter layer of a standard spectral graph neural network (GNN) commutes with graph functional shifts, but the activation function breaks this symmetry. Instead, we propose nonlinear spectral filters (NLSFs) that are fully equivariant to graph functional shifts, and we show that they have universal approximation properties. The proposed NLSFs are based on a new form of spectral domain that is transferable between graphs. We demonstrate the superior performance of NLSFs over existing spectral GNNs on node and graph classification benchmarks.
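The claim that linear spectral filters commute with graph functional shifts, while pointwise activations do not, can be checked numerically. The sketch below (an illustration, not the paper's method; all names are ours) builds a symmetric graph shift operator `S`, constructs a unitary `U` that commutes with `S` by flipping signs in its eigenbasis, and compares a polynomial filter `h(S)` against a pointwise ReLU:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Symmetric graph shift operator: adjacency matrix of a small random graph.
A = np.triu(rng.integers(0, 2, (n, n)), 1).astype(float)
S = A + A.T

# Eigendecomposition S = V diag(lam) V^T.
lam, V = np.linalg.eigh(S)

# A graph functional shift: a unitary (here orthogonal) operator commuting
# with S, obtained by applying sign flips in the eigenbasis of S.
signs = np.array([1.0, -1.0, 1.0, -1.0, 1.0, -1.0])
U = V @ np.diag(signs) @ V.T

# A linear spectral filter: a polynomial in S.
def h(M):
    return 0.5 * np.eye(n) + 0.3 * M + 0.1 * (M @ M)

x = rng.standard_normal(n)

# The linear filter commutes with the functional shift ...
print(np.allclose(h(S) @ (U @ x), U @ (h(S) @ x)))  # True

# ... but a pointwise ReLU generally does not.
relu = lambda v: np.maximum(v, 0.0)
print(np.allclose(relu(U @ x), U @ relu(x)))  # generally False
```

Since `U` and `S` share an eigenbasis, any polynomial in `S` commutes with `U` exactly; the pointwise nonlinearity, by contrast, is not diagonal in that basis, which is precisely the symmetry breaking the abstract refers to.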