Acronym Disambiguation (AD) is a fundamental challenge in technical text processing, particularly in specialized domains where high ambiguity complicates automated analysis. This paper addresses AD in the context of the TextMine'26 competition on French railway documentation. We present DACE (Dynamic Prompting, Retrieval-Augmented Generation, Contextual Selection, and Ensemble Aggregation), a framework that enhances Large Language Models through adaptive in-context learning and the injection of external domain knowledge. By dynamically tailoring prompts to the ambiguity of each acronym and aggregating ensemble predictions, DACE mitigates hallucination and handles low-resource scenarios effectively. Our approach secured the top rank in the competition with an F1 score of 0.9069.
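To make the abstract's pipeline concrete, the sketch below illustrates one plausible way to combine prompt variation with ensemble voting for acronym disambiguation. It is an illustrative assumption only: the `query_llm` helper, the prompt wording, and the majority-vote rule are hypothetical stand-ins, not the implementation described in the paper.

```python
# Illustrative sketch only: DACE's actual implementation is not specified here.
# `query_llm` is a hypothetical stand-in for any LLM completion call.
from collections import Counter
from typing import Callable


def disambiguate(acronym: str,
                 context: str,
                 candidates: list[str],
                 query_llm: Callable[[str], str],
                 n_prompts: int = 3) -> str:
    """Pick an expansion by majority vote over several prompt variants."""
    votes: Counter[str] = Counter()
    for i in range(n_prompts):
        # Dynamic prompting: vary the instruction (e.g. add more context or
        # retrieved domain snippets) when the acronym is highly ambiguous.
        prompt = (
            f"Variant {i}: In the sentence below, which expansion of "
            f"'{acronym}' is correct?\n"
            f"Sentence: {context}\n"
            f"Options: {', '.join(candidates)}\n"
            "Answer with exactly one of the options."
        )
        answer = query_llm(prompt).strip()
        if answer in candidates:  # discard hallucinated answers outside the candidate set
            votes[answer] += 1
    # Ensemble aggregation: the most-voted candidate wins; fall back to the first option.
    return votes.most_common(1)[0][0] if votes else candidates[0]
```

Restricting votes to the known candidate set and aggregating across prompt variants is one simple way to curb hallucinated expansions; the paper's own selection and aggregation strategies may differ.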