Crossley, Scott
Allen, Laura K.
Snow, Erica L.
McNamara, Danielle S.
2016-12-25
<p>This study investigates a novel approach to automatically assessing essay quality that combines natural language processing approaches that assess text features with approaches that assess individual differences in writers such as demographic information, standardized test scores, and survey results. The results demonstrate that combining text features and individual differences increases the accuracy of automatically assigned essay scores over using either individual differences or text features alone. The findings presented here have important implications for writing educators because they reveal that essay scoring methods can benefit from the incorporation of features taken not only from the essay itself (e.g., features related to lexical and syntactic complexity), but also from the writer (e.g., vocabulary knowledge and writing attitudes). The findings have implications for educational data mining researchers because they demonstrate new natural language processing approaches that afford the automatic assessment of performance outcomes.</p>
https://doi.org/10.5281/zenodo.3554594
oai:zenodo.org:3554594
eng
Zenodo
https://jedm.educationaldatamining.org/index.php/JEDM/article/view/143
https://doi.org/10.5281/zenodo.3554593
info:eu-repo/semantics/openAccess
Creative Commons Attribution Non Commercial No Derivatives 4.0 International
https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode
Journal of Educational Data Mining, 8(2), 1-19, (2016-12-25)
automated essay scoring
natural language processing
individual differences
intelligent tutoring systems
writing quality
Incorporating Learning Characteristics into Automatic Essay Scoring Models: What Individual Differences and Linguistic Features Tell Us about Writing Quality
info:eu-repo/semantics/article