Published July 18, 2022 | Version v1
Conference paper (Open Access)

Using Machine Learning Explainability Methods to Personalize Interventions for Students

Contributors

  • 1. University of Canterbury, NZ
  • 2. University of Illinois Urbana–Champaign, US

Description

Machine learning is a powerful method for predicting the outcomes of interactions with educational software, such as the grade a student is likely to receive. However, a predicted outcome alone provides little insight regarding how a student's experience should be personalized based on that outcome. In this paper, we explore a generalizable approach for resolving this issue by personalizing learning using explanations of predictions generated via machine learning explainability methods. We tested the approach in a self-guided, self-paced online learning system for college-level introductory statistics topics that provided personalized interventions for encouraging self-regulated learning behaviors. The system used explanations generated by SHAP (SHapley Additive exPlanations) to recommend specific actions for students to take based on features that most negatively influenced predicted learning outcomes; an "expert system" comparison condition provided recommendations based on predefined rules. A randomized controlled trial of 73 participants (37 expert-system condition, 36 explanation condition) revealed similar learning and topic-choosing behavior between conditions, suggesting that XAI-informed interventions facilitated student statistics learning to a similar degree as expert-system interventions.
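To make the core idea concrete, here is a minimal sketch (not the authors' system) of how SHAP-style explanations can drive intervention choice. It assumes a linear prediction model, for which the exact SHAP value of a feature is its weight times the feature's deviation from its baseline mean; the feature names, weights, and intervention messages below are hypothetical illustrations, not values from the paper.

```python
# Hypothetical features, linear-model weights, and baseline (mean) values.
WEIGHTS = {"quiz_attempts": 0.8, "time_on_task": 0.5, "idle_gaps": -1.2}
BASELINE = {"quiz_attempts": 3.0, "time_on_task": 40.0, "idle_gaps": 2.0}

# Hypothetical self-regulated-learning intervention messages per feature.
INTERVENTIONS = {
    "quiz_attempts": "Try another practice quiz on this topic.",
    "time_on_task": "Spend a few more minutes reviewing the material.",
    "idle_gaps": "Take shorter breaks to stay engaged with the lesson.",
}

def shap_linear(x):
    """Exact SHAP values for a linear model: w_i * (x_i - mean_i)."""
    return {f: WEIGHTS[f] * (x[f] - BASELINE[f]) for f in WEIGHTS}

def recommend(x):
    """Pick the feature with the most negative contribution to the
    predicted outcome and return its associated intervention."""
    contrib = shap_linear(x)
    worst = min(contrib, key=contrib.get)  # most negative SHAP value
    return worst, INTERVENTIONS[worst]

student = {"quiz_attempts": 1.0, "time_on_task": 35.0, "idle_gaps": 6.0}
feature, action = recommend(student)
```

For this student, `idle_gaps` contributes -1.2 * (6 - 2) = -4.8, the most negative value, so its intervention is recommended. In practice a nonlinear model would be explained with a SHAP explainer rather than this closed form, but the selection step, acting on the most negatively contributing feature, is the same.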

Files

2022.EDM-short-papers.43.pdf (792.9 kB, md5:d8c2df65e48fc4250b5af9a2435a54b9)
