The paper extends an existing Intelligent Tutoring System (ITS) that supports students' learning via AI-driven personalized hints and can generate explanations to justify why and how the hints were generated. In this work, we investigate personalizing these hint explanations for students with low levels of two traits, Need for Cognition and Conscientiousness, in order to enhance their engagement with the explanations, based on prior findings that these students generally do not ask for explanations although they would benefit from them. We evaluate the effectiveness of the personalized hint explanations with a formal user study. Our results show that the personalization increases our target users' interaction with the hint explanations, their understanding of the hints, and their learning. Hence, this work contributes to existing initial evidence on the value of Personalized Explainable AI (PXAI) in education.