Entity linking (mapping ambiguous mentions in text to entities in a knowledge base) is a foundational step in tasks such as knowledge graph construction, question answering, and information extraction. Our method, LELA, is a modular coarse-to-fine approach that leverages the capabilities of large language models (LLMs) and works across different target domains, knowledge bases, and LLMs without any fine-tuning phase. Our experiments across various entity linking settings show that LELA is highly competitive with fine-tuned approaches and substantially outperforms non-fine-tuned ones.