Message-passing graph neural networks (GNNs) excel at capturing local relationships but struggle with long-range dependencies in graphs. In contrast, graph transformers (GTs) enable global information exchange but often oversimplify the graph structure by representing graphs as sets of fixed-length vectors. This work introduces a novel architecture that overcomes the shortcomings of both approaches by combining the long-range information of random walks with local message passing. By treating random walks as sequences, our architecture leverages recent advances in sequence models to capture long-range dependencies within these walks. Based on this concept, we propose a framework that offers (1) more expressive graph representations through random walk sequences, (2) the ability to utilize any sequence model for capturing long-range dependencies, and (3) the flexibility to integrate various GNN and GT architectures. Our experimental evaluation demonstrates that our approach achieves significant performance improvements on 19 graph and node benchmark datasets, notably outperforming existing methods by up to 13\% on the PascalVOC-SP and COCO-SP datasets. The code is available at https://github.com/BorgwardtLab/NeuralWalker.
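The central idea of treating random walks as sequences can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the adjacency-list representation, and the toy graph below are illustrative assumptions. Each sampled walk is a node sequence that any sequence model (e.g., an RNN or a state-space model) could then consume to propagate information beyond the local neighborhood.

```python
import random

def sample_random_walks(adj, walk_length, walks_per_node, seed=0):
    """Sample fixed-length random walks from every node of a graph.

    adj: adjacency lists, adj[v] = list of neighbors of node v.
    Returns a list of walks; each walk is a sequence of node indices
    suitable as input to a sequence model (illustrative sketch only).
    """
    rng = random.Random(seed)  # seeded for reproducibility
    walks = []
    for start in range(len(adj)):
        for _ in range(walks_per_node):
            walk = [start]
            for _ in range(walk_length - 1):
                neighbors = adj[walk[-1]]
                if not neighbors:  # stop early at an isolated node
                    break
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks

# Toy 4-node cycle graph as adjacency lists (hypothetical example).
adj = [[1, 3], [0, 2], [1, 3], [2, 0]]
walks = sample_random_walks(adj, walk_length=5, walks_per_node=2)
```

In a full model along the lines described above, the walk embeddings produced by the sequence model would be aggregated back onto their constituent nodes and combined with a local message-passing or transformer layer.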