Message Passing Graph Neural Networks are known to suffer from two problems that are sometimes believed to be diametrically opposed: over-squashing and over-smoothing. The former results from topological bottlenecks that hamper the information flow from distant nodes and is typically mitigated by spectral gap maximization, primarily through edge additions. However, such additions often promote over-smoothing, which renders nodes of different classes less distinguishable. Inspired by the Braess phenomenon, we argue that deleting edges can address over-squashing and over-smoothing simultaneously. This insight explains how edge deletions can improve generalization, thus connecting spectral gap optimization to the seemingly disconnected objective of reducing computational resources by pruning graphs for lottery tickets. To this end, we propose a more effective spectral gap optimization framework to add or delete edges and demonstrate its effectiveness on large heterophilic datasets.
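To make the central quantity concrete, the following is a minimal sketch (not the paper's actual framework) of spectral gap computation and a one-step greedy edge deletion. The spectral gap here is the second-smallest eigenvalue of the symmetric normalized Laplacian; the toy barbell-style graph, the helper `spectral_gap`, and the greedy search are all illustrative choices, not from the source.

```python
import itertools
import numpy as np

def spectral_gap(edges, n):
    """Second-smallest eigenvalue of the symmetric normalized Laplacian
    L = I - D^{-1/2} A D^{-1/2} of an undirected graph on n nodes."""
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d, 1.0) ** -0.5  # guard isolated nodes
    L = np.eye(n) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    return np.sort(np.linalg.eigvalsh(L))[1]

# Toy graph: two 4-cliques joined by a single bridge -- a topological
# bottleneck of the kind that causes over-squashing.
clique1 = list(itertools.combinations(range(4), 2))
clique2 = list(itertools.combinations(range(4, 8), 2))
edges = clique1 + clique2 + [(3, 4)]

base = spectral_gap(edges, 8)

# One greedy proxy step: try every single-edge deletion and keep the one
# that yields the largest spectral gap (deleting the bridge disconnects
# the graph, driving the gap to 0, so it is never selected).
best_edge, best_gap = max(
    ((e, spectral_gap([f for f in edges if f != e], 8)) for e in edges),
    key=lambda t: t[1],
)
print(f"gap before: {base:.4f}, best deletion: {best_edge}, gap after: {best_gap:.4f}")
```

A Braess-style effect appears whenever `best_gap` exceeds `base`: removing an edge enlarges the spectral gap, easing the bottleneck without the densification that drives over-smoothing.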