Complex Query Answering (CQA) over Knowledge Graphs (KGs) is a challenging task. Given that KGs are usually incomplete, neural models have been proposed to solve CQA by performing multi-hop logical reasoning. However, most of them cannot perform well on one-hop and multi-hop queries simultaneously. Recent work proposes a logical message passing mechanism based on pre-trained neural link predictors. While effective on both one-hop and multi-hop queries, it ignores the difference between constant and variable nodes in a query graph. In addition, during the node embedding update stage, this mechanism cannot dynamically measure the importance of different messages, and whether it can capture the implicit logical dependencies related to a node and its received messages remains unclear. In this paper, we propose the Conditional Logical Message Passing Transformer (CLMPT), which accounts for the difference between constants and variables when using pre-trained neural link predictors and performs message passing conditionally on the node type. We empirically verify that this approach reduces computational costs without affecting performance. Furthermore, CLMPT uses a transformer to aggregate received messages and update the corresponding node embedding. Through the self-attention mechanism, CLMPT can assign adaptive weights to elements in an input set consisting of the received messages and the corresponding node, and explicitly model logical dependencies among these elements. Experimental results show that CLMPT is a new state-of-the-art neural CQA model.
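The two core ideas in the abstract, self-attention over the set formed by a node and its received messages, and updating only conditionally on node type, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the class name, the use of `nn.MultiheadAttention`, the layer norm, and the rule "update only variable nodes" are all assumptions made here for clarity.

```python
import torch
import torch.nn as nn

class ConditionalMessageAggregator(nn.Module):
    """Hypothetical sketch of transformer-based message aggregation:
    the node embedding and its received messages form an input set,
    self-attention assigns adaptive weights across the set, and the
    output at the node's position becomes its updated embedding.
    Names and hyperparameters are illustrative assumptions."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, node_emb: torch.Tensor, messages: torch.Tensor) -> torch.Tensor:
        # node_emb: (dim,); messages: (num_msgs, dim)
        # Put the node itself at position 0 of the input set.
        x = torch.cat([node_emb.unsqueeze(0), messages], dim=0).unsqueeze(0)
        out, _ = self.attn(x, x, x)   # self-attention over the whole set
        return self.norm(out[0, 0])   # updated embedding at the node's slot

def update_if_variable(agg: ConditionalMessageAggregator,
                       node_emb: torch.Tensor,
                       messages: torch.Tensor,
                       is_variable: bool) -> torch.Tensor:
    # Conditional message passing on node type: constants keep their
    # embedding, only variable nodes are updated (an assumption of this
    # sketch, mirroring the abstract's cost-reduction claim).
    return agg(node_emb, messages) if is_variable else node_emb
```

In this sketch the attention weights play the role of the "adaptive weights" the abstract mentions, and skipping the update for constant nodes is what saves computation.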