
Published July 5, 2023 | Version v1
Conference paper | Open Access

How to Open Science: Debugging Reproducibility within the Educational Data Mining Conference

  • 1. WestEd, USA
  • 2. EPFL, Switzerland
  • 3. Google Research and Indian Institute of Science, India

Description

Despite increased efforts to assess the adoption of open science practices and the robustness of reproducibility in sub-disciplines of educational technology, there is little understanding of why some research is not reproducible. Prior work has taken a first step toward assessing the reproducibility of research, but has operated under constraints that limit what it can discover. The purpose of this study was therefore to replicate previous work on papers within the proceedings of the International Conference on Educational Data Mining and to report accurately on which papers are reproducible and why. Specifically, we examined 208 papers, attempted to reproduce them, documented the reasons for reproducibility failures, and asked authors to provide the additional information needed to reproduce their studies. Our results showed that of the 12 papers that were potentially reproducible, only one successfully reproduced all analyses, and another two reproduced most of their analyses. The most common cause of reproducibility failure was undeclared library dependencies, followed by non-seeded randomness.
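To make the two most common failure modes concrete, the following minimal Python sketch (not taken from the paper; file names, versions, and function names are illustrative assumptions) shows the corresponding remedies: declaring library dependencies explicitly and seeding all sources of randomness so an analysis produces the same output on every run.

# Illustrative sketch only, not the authors' code.
# Remedy 1: declare dependencies in a pinned requirements file
# (hypothetical versions) so others can recreate the environment:
#   numpy==1.24.3
#   scikit-learn==1.2.2

import random

import numpy as np

SEED = 42  # fixed seed so stochastic steps are repeatable


def set_seeds(seed: int = SEED) -> None:
    """Remedy 2: seed every random number generator the analysis uses."""
    random.seed(seed)
    np.random.seed(seed)


def run_analysis() -> float:
    """Toy stand-in for an analysis step that involves randomness."""
    set_seeds()
    data = np.random.normal(loc=0.0, scale=1.0, size=1_000)
    return float(data.mean())


if __name__ == "__main__":
    # With seeding in place, this prints the same value on every run.
    print(run_analysis())

A reproduction attempt can then install the pinned dependencies and rerun the script, expecting byte-identical analytical results rather than values that drift with each execution.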

Files

2023.EDM-long-papers.10.pdf (362.2 kB)
md5: e5cc65d7a15cd55c199aea3775f62e73