Student commitment to a learning recommendation is inseparable from their understanding of the reasons it was recommended to them, and from their ability to modify it based on that understanding. Among explainability approaches, chatbots offer the potential to engage the student in a conversation, similar to a discussion with a peer or a mentor. The capabilities of chatbots, however, are still not sufficient to replace a human mentor, despite the advancements of generative AI (GenAI) and large language models (LLMs). We therefore propose an approach that utilizes chatbots as mediators of the conversation and as sources of limited, controlled generation of explanations, harnessing the potential of LLMs while reducing their risks. The proposed LLM-based chatbot supports students in understanding learning-path recommendations. We use a knowledge graph (KG) as a human-curated source of information to regulate the LLM's output by defining its prompt's context. A group-chat approach is developed to connect students with human mentors, either on demand or in cases that exceed the chatbot's pre-defined tasks. We evaluate the chatbot in a user study, providing a proof of concept and highlighting the requirements and limitations of utilizing chatbots in conversational explainability.