Signed graphs allow for encoding positive and negative relations between nodes and are used to model various online activities. Node representation learning for signed graphs is a well-studied task with important applications such as sign prediction. While the size of datasets is ever-increasing, recent methods often sacrifice scalability for accuracy. We propose a novel message-passing layer architecture called Graph Spring Network (GSN) modeled after spring forces. We combine it with a Graph Neural Ordinary Differential Equations (ODEs) formalism to optimize the system dynamics in embedding space to solve a downstream prediction task. Once the dynamics are learned, embedding generation for novel datasets is done by solving the ODEs in time using a numerical integration scheme. Our GSN layer leverages fast-to-compute edge vector directions and learnable scalar functions that depend only on nodes' distances in latent space to compute the nodes' positions. Conversely, Graph Convolution and Graph Attention Network layers rely on learnable vector functions that require the full positions of input nodes in latent space. We propose a specific implementation called Spring-Neural-Network (SPR-NN), using a set of small neural networks mimicking attractive and repulsive spring forces, which we train for link sign prediction. Experiments show that our method achieves accuracy close to state-of-the-art methods with node generation time speedup factors of up to 28,000 on large graphs.
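The spring-force message-passing idea above can be illustrated with a minimal sketch. This is not the paper's SPR-NN; the force functions here are hypothetical hand-coded stand-ins for the learnable scalar functions, chosen so that positive edges attract nodes toward a rest length and negative edges repel them. The key property the abstract describes is visible in the code: each message is a cheap edge direction scaled by a scalar of the pairwise distance, and one explicit-Euler step plays the role of the numerical ODE integration.

```python
import numpy as np

def spring_step(pos, edges, signs, dt=0.1, rest=1.0, k=1.0):
    """One explicit-Euler integration step of spring-like dynamics
    on a signed graph (illustrative sketch, not the actual SPR-NN).

    pos:   (n, d) array of node positions in latent space
    edges: list of (i, j) index pairs
    signs: list of +1 / -1 edge signs
    """
    force = np.zeros_like(pos)
    for (i, j), s in zip(edges, signs):
        diff = pos[j] - pos[i]
        dist = np.linalg.norm(diff) + 1e-9   # avoid division by zero
        u = diff / dist                      # fast-to-compute edge direction
        if s > 0:
            mag = k * (dist - rest)          # attractive: pull toward rest length
        else:
            mag = -k / dist                  # repulsive: push apart, decays with distance
        # the message is a scalar function of distance times the edge direction
        force[i] += mag * u
        force[j] -= mag * u
    return pos + dt * force                  # explicit-Euler ODE step
```

After one step, two nodes joined by a positive edge end up closer together, and two nodes joined by a negative edge end up farther apart; in the learned setting, the hand-coded `mag` expressions would be replaced by small neural networks taking only the distance as input.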