We describe a system for building task-oriented dialogue systems that combines the in-context learning abilities of large language models (LLMs) with the deterministic execution of business logic. LLMs are used to translate between the surface form of the conversation and a domain-specific language (DSL) that drives the business logic forward. We compare our approach to the intent-based NLU approach predominantly used in industry today. Our experiments show that developing chatbots with our system requires significantly less effort than established approaches, that these chatbots can successfully navigate complex dialogues which are extremely challenging for NLU-based systems, and that our system has desirable properties for scaling task-oriented dialogue systems to a large number of tasks. We make our implementation available for use and further study.