Signed graphs encode positive and negative relations between nodes and are used to model various online activities. Node representation learning for signed graphs is a well-studied task with important applications such as sign prediction. While dataset sizes keep growing, recent methods often sacrifice scalability for accuracy. We propose a novel message-passing layer architecture called Graph Spring Network (GSN), modeled after spring forces. We combine it with a Graph Neural Ordinary Differential Equations (ODEs) formalism to optimize the system dynamics in embedding space for a downstream prediction task. Once the dynamics are learned, embeddings for novel datasets are generated by solving the ODEs in time with a numerical integration scheme. Our GSN layer leverages fast-to-compute edge vector directions and learnable scalar functions that depend only on the nodes' distances in latent space to compute the nodes' positions. In contrast, Graph Convolution and Graph Attention Network layers rely on learnable vector functions that require the full positions of input nodes in latent space. We propose a specific implementation called Spring-Neural-Network (SPR-NN), which uses a set of small neural networks mimicking attractive and repulsive spring forces, trained for link sign prediction. Experiments show that our method achieves accuracy close to state-of-the-art methods with embedding-generation speedup factors of up to 28,000 on large graphs.
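To make the spring analogy concrete, the following minimal sketch (not the paper's implementation; all names and the force functions are illustrative) shows the key structural idea: each node receives force contributions along edge directions, scaled by a scalar function of the pairwise distance only, with positive edges attracting and negative edges repulsing; the resulting dynamics are then advanced with a simple forward-Euler integrator standing in for a generic ODE solver.

```python
import numpy as np

def gsn_forces(positions, edges, signs, f_attract, f_repulse):
    """One GSN-style message-passing step (illustrative sketch).

    positions : (n, d) array of node embeddings
    edges     : list of (i, j) index pairs
    signs     : list of +1 / -1 edge signs
    f_attract, f_repulse : scalar functions of distance
    (in SPR-NN these scalar functions would be small neural networks)
    """
    forces = np.zeros_like(positions)
    for (i, j), s in zip(edges, signs):
        diff = positions[j] - positions[i]
        dist = np.linalg.norm(diff) + 1e-9
        direction = diff / dist  # fast-to-compute edge direction
        # positive edges attract, negative edges repulse
        magnitude = f_attract(dist) if s > 0 else -f_repulse(dist)
        forces[i] += magnitude * direction
        forces[j] -= magnitude * direction
    return forces

def integrate(positions, edges, signs, f_a, f_r, dt=0.05, steps=20):
    """Forward-Euler integration of the embedding dynamics in time."""
    x = positions.copy()
    for _ in range(steps):
        x = x + dt * gsn_forces(x, edges, signs, f_a, f_r)
    return x
```

Note that the scalar functions see only distances, never full coordinates; this is the property the abstract contrasts with Graph Convolution and Graph Attention layers, and it is what makes the per-edge computation cheap.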