Auto-scoring Student Responses with Images in Mathematics
Editors:
1. WestEd, USA
2. EPFL, Switzerland
3. Google Research and Indian Institute of Science, India
Description
Teachers often rely on a range of open-ended problems to assess students' understanding of mathematical concepts. Beyond traditional forms of open-ended work, commonly textual short-answer or essay responses, figures, tables, number lines, graphs, and pictographs are also common in mathematics. While recent developments in natural language processing and machine learning have led to automated methods for scoring student open-ended work, these methods have largely been limited to textual answers. Several computer-based learning systems allow students to take pictures of handwritten work and include such images in their answers to open-ended questions; however, few, if any, existing solutions support auto-scoring of handwritten or drawn answers. In this work, we build upon an existing method for auto-scoring textual student answers and explore the use of OpenAI's CLIP, a deep learning embedding method designed to represent both images and text, as well as Optical Character Recognition (OCR), to improve model performance. We evaluate our method on a dataset of student open responses that contains both text- and image-based answers, and find a reduction in model error in the presence of images when controlling for other answer-level features.
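For readers who want a concrete picture of the image-aware features described above, the sketch below shows one way CLIP embeddings of a response image and its answer text could be combined into a single feature vector for a downstream scoring model. This is a minimal sketch using the Hugging Face `transformers` API, not the authors' released implementation; the checkpoint name, the `embed_response` helper, and the zero-vector placeholder for text-only responses are assumptions introduced for illustration.

```python
# Minimal sketch: combine CLIP text and image embeddings for a student response.
# Not the authors' code; checkpoint and helper names are illustrative.
from typing import Optional

import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

clip = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")


def embed_response(answer_text: str, image_path: Optional[str] = None) -> torch.Tensor:
    """Build one feature vector for a response that may include an image of
    handwritten work. OCR-extracted text, if available, could be appended to
    `answer_text` before embedding."""
    # Text embedding from CLIP's text tower (truncated to its 77-token limit).
    text_inputs = processor(
        text=[answer_text], return_tensors="pt", padding=True, truncation=True
    )
    with torch.no_grad():
        text_emb = clip.get_text_features(**text_inputs)

    if image_path is not None:
        image = Image.open(image_path).convert("RGB")
        image_inputs = processor(images=image, return_tensors="pt")
        with torch.no_grad():
            image_emb = clip.get_image_features(**image_inputs)
    else:
        # Text-only responses: keep the feature length fixed with a zero vector.
        image_emb = torch.zeros_like(text_emb)

    # The concatenated features can feed a downstream scoring model,
    # e.g., a regression head predicting the rubric score.
    return torch.cat([text_emb, image_emb], dim=-1).squeeze(0)
```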
Files
| Name | Size | MD5 |
|---|---|---|
| 2023.EDM-short-papers.36.pdf | 521.3 kB | bb459051f1fe4e12d130bf0cab30fa65 |