Complex Query Answering (CQA) over Knowledge Graphs (KGs) is a challenging task. Given that KGs are usually incomplete, neural models have been proposed to solve CQA by performing multi-hop logical reasoning. However, most of them cannot perform well on both one-hop and multi-hop queries simultaneously. Recent work proposes a logical message passing mechanism based on pre-trained neural link predictors. While effective on both one-hop and multi-hop queries, it ignores the difference between constant and variable nodes in a query graph. In addition, during the node embedding update stage, this mechanism cannot dynamically measure the importance of different messages, and whether it can capture the implicit logical dependencies between a node and its received messages remains unclear. In this paper, we propose the Conditional Logical Message Passing Transformer (CLMPT), which accounts for the difference between constants and variables when using pre-trained neural link predictors and performs message passing conditionally on the node type. We empirically verify that this approach reduces computational costs without affecting performance. Furthermore, CLMPT uses a transformer to aggregate received messages and update the corresponding node embedding. Through the self-attention mechanism, CLMPT can assign adaptive weights to the elements of an input set consisting of the received messages and the corresponding node, and can explicitly model the logical dependencies between these elements. Experimental results show that CLMPT is a new state-of-the-art neural CQA model. Our code is available at https://github.com/qianlima-lab/CLMPT.
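To make the transformer-based aggregation concrete, the following is a minimal illustrative sketch (not the paper's implementation): a node's embedding and its received messages form an input set, a single attention head scores each element against the node, and the softmax-weighted sum becomes the updated node embedding. The function names are hypothetical, and the Q/K/V projections are simplified to the identity for brevity.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention_update(node_emb, messages):
    """Attend from the node over the set {node} ∪ messages and
    return the weighted sum as the updated node embedding.
    Identity Q/K/V projections; a real transformer layer would
    also add learned projections, residuals, and normalization."""
    elements = [node_emb] + messages
    d = len(node_emb)
    # scaled dot-product scores of the node query against every element
    scores = [dot(node_emb, e) / math.sqrt(d) for e in elements]
    weights = softmax(scores)  # adaptive importance per element
    # weighted sum of the elements = updated node embedding
    return [sum(w * e[i] for w, e in zip(weights, elements))
            for i in range(d)]
```

For example, `attention_update([1.0, 0.0], [[0.0, 1.0], [1.0, 0.0]])` weights the message aligned with the node more heavily than the orthogonal one, illustrating how self-attention adaptively weights received messages rather than averaging them uniformly.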