Knowledge graph reasoning plays a vital role in many applications and has garnered considerable attention. Recently, path-based methods have achieved impressive performance. However, they may suffer from limitations inherited from message-passing neural networks, such as missing paths and information over-squashing. In this paper, we revisit the application of transformers to knowledge graph reasoning in order to address these constraints, and propose a novel method, KnowFormer. KnowFormer uses a transformer architecture to reason over knowledge graphs from a message-passing perspective, rather than reasoning over textual information as in previous pretrained-language-model-based methods. Specifically, we define the attention computation based on the query prototype of knowledge graph reasoning, which facilitates convenient construction and efficient optimization. To incorporate structural information into the self-attention mechanism, we introduce structure-aware modules to compute the query, key, and value representations, respectively. We also present an efficient attention computation method for better scalability. Experimental results demonstrate the superior performance of KnowFormer over prominent baselines on both transductive and inductive benchmarks.
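To make the idea concrete, the following is a minimal PyTorch sketch of a structure-aware self-attention layer in the spirit of the description above: query/key/value projections are conditioned on the query relation, and a pairwise structural bias (e.g., derived from relational paths) is added to the attention logits. All names here (`StructureAwareAttention`, `rel_emb`, `struct_bias`) are illustrative assumptions, not the actual KnowFormer implementation.

```python
# A hypothetical sketch of structure-aware self-attention for knowledge
# graph reasoning; the paper's actual formulation may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StructureAwareAttention(nn.Module):
    """Self-attention over entity representations with a pairwise
    structural bias added to the attention logits, and projections
    conditioned on the query relation."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        # One embedding per relation, used here as a simple stand-in for
        # the paper's "query prototype"; the real mechanism is richer.
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.scale = dim ** -0.5

    def forward(self, x, rel_id, struct_bias):
        # x:           (num_entities, dim)  entity representations
        # rel_id:      ()                   index of the query relation
        # struct_bias: (num_entities, num_entities) pairwise structural
        #              scores, e.g., from path statistics; use -inf where
        #              no path exists to mask missing paths
        r = self.rel_emb(rel_id)                 # (dim,)
        q = self.q_proj(x * r)                   # relation-conditioned query
        k = self.k_proj(x)
        v = self.v_proj(x)
        logits = (q @ k.t()) * self.scale + struct_bias
        attn = F.softmax(logits, dim=-1)
        return attn @ v


# Toy usage: 5 entities, 3 relations, 16-dimensional representations.
torch.manual_seed(0)
layer = StructureAwareAttention(dim=16, num_relations=3)
x = torch.randn(5, 16)
bias = torch.zeros(5, 5)          # here: no structural preference
out = layer(x, torch.tensor(1), bias)
print(out.shape)                  # torch.Size([5, 16])
```

Because the attention is computed over all entity pairs, an efficient approximation (as the abstract mentions) would be needed for large graphs; the dense form above is only for exposition.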