Monitoring in-class programming exercises can help instructors identify struggling students and common challenges. However, understanding students' progress can be prohibitively difficult, particularly for multi-faceted problems that include multiple steps with complex interdependencies, have no predictable completion order, or involve evaluation criteria that are difficult to summarize across many students (e.g., exercises building interactive web-based user interfaces). We introduce SPARK, a coding exercise monitoring dashboard designed to address these challenges. SPARK allows instructors to flexibly group substeps into checkpoints based on exercise requirements, suggests automated tests for these checkpoints, and generates visualizations to track progress across steps. SPARK also allows instructors to inspect intermediate outputs, providing deeper insights into solution variations. We also construct a dataset of keystroke-level coding data from N=22 learners solving two 40-minute web programming exercises and provide empirical insights into the perceived usefulness of SPARK through a within-subjects evaluation with 16 programming instructors.