Preprint Open Access

Automatic Classification of Error Types in Solutions to Programming Assignments at Online Learning Platform

Artyom Lobanov; Timofey Bryksin; Alexey Shpilman


MARC21 XML Export

<?xml version='1.0' encoding='UTF-8'?>
<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nam##2200000uu#4500</leader>
  <datafield tag="041" ind1=" " ind2=" ">
    <subfield code="a">eng</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">MOOC, automatic evaluation, clustering, classification, programming</subfield>
  </datafield>
  <controlfield tag="005">20200120172445.0</controlfield>
  <controlfield tag="001">2631872</controlfield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">JetBrains Research, Saint Petersburg State University</subfield>
    <subfield code="0">(orcid)0000-0001-9022-3563</subfield>
    <subfield code="a">Timofey Bryksin</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">JetBrains Research, Higher School of Economics</subfield>
    <subfield code="a">Alexey Shpilman</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">173967</subfield>
    <subfield code="z">md5:6e2a0721358da4d6b2e5666ec1a48ede</subfield>
    <subfield code="u">https://zenodo.org/record/2631872/files/paper.pdf</subfield>
  </datafield>
  <datafield tag="542" ind1=" " ind2=" ">
    <subfield code="l">open</subfield>
  </datafield>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2019-04-07</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="O">
    <subfield code="p">openaire</subfield>
    <subfield code="o">oai:zenodo.org:2631872</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="u">JetBrains Research, Higher School of Economics</subfield>
    <subfield code="a">Artyom Lobanov</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">Automatic Classification of Error Types in Solutions to Programming Assignments at Online Learning Platform</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
    <subfield code="u">https://creativecommons.org/licenses/by/4.0/legalcode</subfield>
    <subfield code="a">Creative Commons Attribution 4.0 International</subfield>
  </datafield>
  <datafield tag="650" ind1="1" ind2="7">
    <subfield code="a">cc-by</subfield>
    <subfield code="2">opendefinition.org</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a">&lt;p&gt;&amp;nbsp;Online programming courses are becoming more and more popular, but they still have significant drawbacks when compared to the&lt;br&gt;
traditional education system, e.g., the lack of feedback. In this study, we apply machine learning methods to improve the feedback of automated verification systems for programming assignments. We propose an approach that provides an insight on how to fix the code for a given incorrect submission. To achieve this, we detect frequent error types by clustering previously submitted incorrect solutions, label these clusters and use this labeled dataset to identify the type of an error in a new submission. We examine and compare several approaches to the detection of frequent error types and to the assignment of clusters to new submissions. The proposed method is evaluated on a dataset provided by a popular online learning platform.&lt;/p&gt;</subfield>
  </datafield>
  <datafield tag="773" ind1=" " ind2=" ">
    <subfield code="n">doi</subfield>
    <subfield code="i">isVersionOf</subfield>
    <subfield code="a">10.5281/zenodo.2631871</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.5281/zenodo.2631872</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">publication</subfield>
    <subfield code="b">preprint</subfield>
  </datafield>
</record>
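The record above can be read with nothing but the standard library. A minimal sketch: the snippet below embeds a trimmed copy of the MARC21 record and pulls out the title (tag 245), authors (tags 100/700), and DOI (tag 24). The helper name `subfields` is illustrative, not part of any MARC tooling.

```python
# Sketch: extracting fields from a Zenodo MARC21 XML export with the stdlib.
# The embedded record is a trimmed copy of the one above.
import xml.etree.ElementTree as ET

MARC_NS = "{http://www.loc.gov/MARC21/slim}"

record_xml = """<record xmlns="http://www.loc.gov/MARC21/slim">
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">Automatic Classification of Error Types in Solutions to Programming Assignments at Online Learning Platform</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="a">Artyom Lobanov</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="a">Timofey Bryksin</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.5281/zenodo.2631872</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
</record>"""

def subfields(root, tag, code):
    """Return the text of every subfield with the given code under datafields with the given tag."""
    return [
        sf.text
        for df in root.iter(f"{MARC_NS}datafield")
        if df.get("tag") == tag
        for sf in df.iter(f"{MARC_NS}subfield")
        if sf.get("code") == code
    ]

root = ET.fromstring(record_xml)
title = subfields(root, "245", "a")[0]
authors = subfields(root, "100", "a") + subfields(root, "700", "a")
doi = subfields(root, "024", "a")[0]
```

Note that `iter()` must be given the namespace-qualified tag name, since the record declares a default namespace; this is why `MARC_NS` is prepended everywhere.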
                  All versions   This version
Views                      146            146
Downloads                  117            117
Data volume            20.4 MB        20.4 MB
Unique views               138            138
Unique downloads           109            109
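The pipeline the abstract describes — cluster previously submitted incorrect solutions, label the clusters with error types, then assign a new submission to the nearest cluster — can be sketched in a few lines. Everything here is a toy illustration under assumed representations (bag-of-tokens vectors, Euclidean nearest-centroid assignment); the names, vocabulary, and sample data are hypothetical and not taken from the paper.

```python
# Toy sketch of the abstract's pipeline: vectorize incorrect submissions,
# form labeled clusters, and classify a new submission by nearest centroid.
# All data and labels below are invented for illustration.
from collections import Counter
import math

def vectorize(code, vocab):
    """Represent a submission as token counts over a fixed vocabulary (an assumed featurization)."""
    counts = Counter(code.split())
    return [counts.get(tok, 0) for tok in vocab]

def centroid(vectors):
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def distance(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def classify(vector, labeled_centroids):
    """Return the error-type label whose cluster centroid is nearest."""
    return min(labeled_centroids, key=lambda lbl: distance(vector, labeled_centroids[lbl]))

# Hypothetical labeled clusters: groups of incorrect submissions that an
# annotator has tagged with a frequent error type.
vocab = ["<", "<=", "return", "print"]
clusters = {
    "off-by-one comparison": ["if i < n", "while i < n"],
    "missing return": ["print result", "print x"],
}
centroids = {lbl: centroid([vectorize(s, vocab) for s in subs])
             for lbl, subs in clusters.items()}

label = classify(vectorize("while i < n", vocab), centroids)
```

In the paper's setting the clustering step itself is also automated and several cluster-assignment strategies are compared; the nearest-centroid rule above stands in for whichever assignment method is used.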
