Oversmoothing in Graph Neural Networks (GNNs) poses a significant challenge as network depth increases, leading to homogenized node representations and a loss of expressiveness. In this work, we approach the oversmoothing problem from a dynamical systems perspective, providing a deeper understanding of the stability and convergence behavior of GNNs. Leveraging insights from dynamical systems theory, we identify the root causes of oversmoothing and propose \textbf{\textit{DYNAMO-GAT}}. This approach utilizes noise-driven covariance analysis and Anti-Hebbian principles to selectively prune redundant attention weights, dynamically adjusting the network's behavior to maintain node feature diversity and stability. Our theoretical analysis reveals how DYNAMO-GAT disrupts the convergence to oversmoothed states, while experimental results on benchmark datasets demonstrate its superior performance and efficiency compared to traditional and state-of-the-art methods. DYNAMO-GAT not only advances the theoretical understanding of oversmoothing through the lens of dynamical systems but also provides a practical and effective solution for improving the stability and expressiveness of deep GNNs.
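To make the abstract's description concrete, the sketch below illustrates one plausible reading of noise-driven, covariance-based Anti-Hebbian pruning of attention weights: inject small noise into node features, estimate pairwise feature covariance, and weaken (zero out) the attention entries between the most strongly co-varying node pairs, since those couplings are the ones driving representations toward homogeneity. The function name, thresholds, and NumPy formulation are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def dynamo_prune_sketch(attention, features, noise_scale=0.01,
                        prune_frac=0.2, seed=0):
    """Illustrative covariance-driven Anti-Hebbian pruning (NOT the
    authors' implementation). Edges between the most-correlated node
    pairs are pruned, following the Anti-Hebbian idea that strongly
    co-active pairs should have their coupling weakened."""
    rng = np.random.default_rng(seed)
    # Noise-driven probing: perturb features before measuring covariance.
    noisy = features + noise_scale * rng.standard_normal(features.shape)
    # Pairwise covariance between node feature vectors (nodes as variables).
    cov = np.cov(noisy)
    # Only existing edges are candidates; self-loops are kept.
    mask = (attention > 0) & ~np.eye(attention.shape[0], dtype=bool)
    edge_cov = np.where(mask, np.abs(cov), -np.inf)
    # Prune the top prune_frac fraction of edges by covariance magnitude.
    n_edges = int(mask.sum())
    k = max(1, int(prune_frac * n_edges))
    prune_idx = np.argpartition(edge_cov.ravel(), -k)[-k:]
    pruned = attention.copy()
    pruned.ravel()[prune_idx] = 0.0
    # Renormalize each row so surviving attention weights sum to one.
    row_sums = pruned.sum(axis=1, keepdims=True)
    return np.divide(pruned, row_sums,
                     out=np.zeros_like(pruned), where=row_sums > 0)
```

In this reading, pruning redundant high-covariance edges directly removes the averaging pathways that pull node representations together, which is how the method would disrupt convergence to an oversmoothed state.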