Graph neural networks (GNNs) are widely believed to excel at node representation learning through trainable neighborhood aggregations. We challenge this view by introducing Fixed Aggregation Features (FAFs), a training-free approach that transforms graph learning tasks into tabular problems. This simple shift enables the use of well-established tabular methods, offering strong interpretability and the flexibility to deploy diverse classifiers. Across 14 benchmarks, well-tuned multilayer perceptrons trained on FAFs rival or outperform state-of-the-art GNNs and graph transformers on 12 tasks, often using only mean aggregation. The only exceptions are the Roman Empire and Minesweeper datasets, which typically require unusually deep GNNs. To explain why non-trainable aggregations can suffice in theory, we connect our findings to Kolmogorov-Arnold representations and discuss when mean aggregation can be sufficient. In conclusion, our results call for (i) richer benchmarks that genuinely benefit from learning diverse neighborhood aggregations, (ii) strong tabular baselines as a standard, and (iii) employing and advancing tabular models for graph data to gain new insights into related tasks.
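The core idea can be illustrated with a minimal sketch: build a fixed (non-trainable) tabular representation by concatenating each node's own features with its mean-aggregated neighbor features over a few hops. The function name, interface, and hop count below are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

def mean_aggregation_features(X, edges, hops=2):
    """Training-free tabular features: concatenate each node's own
    features with mean-aggregated neighbor features for 1..hops hops
    (a sketch of the FAF idea; interface is hypothetical)."""
    n = X.shape[0]
    # Build a row-normalized adjacency matrix (mean aggregation).
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = 1.0
        A[v, u] = 1.0
    deg = A.sum(axis=1, keepdims=True)
    A = np.divide(A, deg, out=np.zeros_like(A), where=deg > 0)

    feats = [X]
    H = X
    for _ in range(hops):
        H = A @ H  # one round of mean aggregation over neighbors
        feats.append(H)
    # Fixed tabular representation: one row per node, ready for any
    # off-the-shelf tabular classifier (e.g. an MLP or gradient boosting).
    return np.concatenate(feats, axis=1)

# Toy graph: 4 nodes on a path, 2-dimensional node features.
X = np.array([[1., 0.], [0., 1.], [1., 1.], [0., 0.]])
edges = [(0, 1), (1, 2), (2, 3)]
T = mean_aggregation_features(X, edges, hops=2)
print(T.shape)  # (4, 6): own features + 1-hop mean + 2-hop mean
```

Because the aggregation is fixed, the resulting matrix can be precomputed once and handed to any tabular model, which is what enables the interpretability and classifier flexibility mentioned above.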