Self-Service Learning Analytics (SSLA) tools aim to support educational stakeholders in creating learning analytics indicators without requiring technical expertise. While such tools promise user control and transparency, their effectiveness and adoption depend critically on usability aspects. This paper presents a comprehensive usability evaluation and improvement of the Indicator Editor, a no-code, exploratory SSLA tool that enables non-technical users to implement custom learning analytics indicators through a structured workflow. Using an iterative evaluation approach, we conduct an exploratory qualitative user study, usability inspections of high-fidelity prototypes, and a workshop-based evaluation in an authentic educational setting with n = 46 students using standardized instruments, namely the System Usability Scale (SUS), User Experience Questionnaire (UEQ), and Net Promoter Score (NPS). Based on the evaluation findings, we derive concrete design implications that inform improvements in workflow guidance, feedback, and information presentation in the Indicator Editor. Furthermore, our evaluation provides practical insights for the design of usable SSLA tools.