Explicitly and logically displaying the chain of reasoning from evidence to answer is essential for explainable question answering (QA). An entailment tree presents these reasoning chains in a structured form, unlike the self-explanation paradigm of large language models. Existing methods rarely consider the semantic associations of sentences within and across layers of the tree structure, which makes them prone to obvious errors when composing entailment steps. In this work, we propose an architecture that integrates the Hierarchical Semantics of sentences under a Controller-Generator framework (HiSCG) to explain answers. HiSCG constructs a hierarchical mapping between hypotheses and facts, discriminates which facts are involved in tree construction, and optimizes single-step entailment generation. To the best of our knowledge, we are the first to exploit the hierarchical semantics of sentences within the same layer and across adjacent layers to improve entailment tree generation. The proposed method achieves comparable performance on all three settings of the EntailmentBank dataset, and generalization results on two out-of-domain datasets further demonstrate its effectiveness.