Event Temporal Relation Extraction (ETRE) is paramount but challenging. Within a discourse, event pairs are situated at different distances, or so-called proximity bands. The temporal ordering communicated about event pairs situated at more remote (i.e., ``long'') or less remote (i.e., ``short'') proximity bands is encoded differently. SOTA models have tended to perform well on events situated at either short or long proximity bands, but not both. Nonetheless, real-world, natural texts contain all types of temporal event pairs. In this paper, we present MulCo: Distilling Multi-Scale Knowledge via Contrastive Learning, a knowledge co-distillation approach that shares knowledge across multiple event pair proximity bands to improve performance on all types of temporal datasets. Our experimental results show that MulCo successfully integrates linguistic cues pertaining to temporal reasoning across both short and long proximity bands and achieves new state-of-the-art results on several ETRE benchmark datasets.