Graph convolutions have gained popularity due to their ability to efficiently operate on data with an irregular geometric structure. However, graph convolutions cause over-smoothing, which refers to representations becoming more similar with increased depth. Many different definitions and intuitions of this phenomenon currently coexist, leading to research efforts focusing on incompatible directions. This paper attempts to align these directions by showing that over-smoothing is merely a special case of power iteration. This greatly simplifies the existing theory on over-smoothing, making it more accessible. Based on this theory, we provide a novel comprehensive definition of rank collapse as a generalized form of over-smoothing and introduce the rank-one distance as a corresponding metric. Our empirical evaluation of 14 commonly used methods shows that more models than were previously known suffer from this issue.
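The connection between over-smoothing and power iteration can be illustrated numerically: repeatedly applying a (linear) graph convolution to node features is a power iteration on the feature matrix, so the features collapse toward the dominant eigenvector of the propagation matrix, i.e., toward a rank-one matrix. The sketch below, a minimal assumption-laden illustration rather than the paper's exact setup, uses a GCN-style symmetrically normalized adjacency and measures closeness to rank-one via the Frobenius-norm residual after the best rank-one approximation; the paper's precise definition of the rank-one distance may differ, and the helper name `rank_one_distance` is hypothetical.

```python
import numpy as np

def rank_one_distance(X):
    """Frobenius distance from X to its best rank-one approximation.

    Illustrative assumption: computed from the singular values beyond the
    first; the paper's exact rank-one distance metric may be defined differently.
    """
    s = np.linalg.svd(X, compute_uv=False)
    return np.sqrt((s[1:] ** 2).sum())

# Toy random graph with self-loops, symmetrically normalized (GCN-style).
rng = np.random.default_rng(0)
n, p = 20, 0.2
A = (rng.random((n, n)) < p).astype(float)
A = np.maximum(A, A.T)                  # make undirected
np.fill_diagonal(A, 0.0)
A += np.eye(n)                          # add self-loops
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))     # D^{-1/2} (A + I) D^{-1/2}

X = rng.standard_normal((n, 8))         # random node features
for layer in range(51):
    X = A_hat @ X                       # linear graph convolution = power iteration step
    if layer % 10 == 0:
        rel = rank_one_distance(X) / np.linalg.norm(X)
        print(f"layer {layer:2d}: relative rank-one distance = {rel:.4f}")
```

For a connected graph, the self-loops ensure the dominant eigenvalue of `A_hat` is strictly largest in magnitude, so the relative rank-one distance decays geometrically with depth, which is exactly the convergence behavior of power iteration.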