Poster | Open Access

Checklist Strategies to Improve the Reproducibility of Deep Learning Experiments with an Illustration

Ali Ben Abbes; Jeaneth Machicao; Leonardo Meneguzzi; Pedro Pizzigatti Corrêa; Alison Specht; Romain David; Gérard Subsol; Danton Ferreira Vellenich; Rodolphe Devillers; Shelley Stall; Nicolas Mouquet; Marc Chaumont; Laure Berti-Equille; David Mouillot

The challenges of Reproducibility and Replicability (R&R) have become a focus of attention in efforts to promote open and accessible research, and good practices for R&R have accordingly been developed in computer science. Nevertheless, Deep Learning (DL) experiments remain difficult for others to reproduce because of the complexity of these techniques, and the use of massive, heterogeneous data adds further to this difficulty. First, we compiled three complementary perspectives to help researchers improve R&R, drawing on machine learning checklists, guidelines, and the FAIR principles. The compilation is useful for (1) a researcher seeking to reproduce a paper, (2) an author reporting on an experiment, and (3) a reviewer seeking to qualify the scientific contribution of a work. Second, we illustrate the compilation with three recent DL experiments for socio-economic estimation using remotely sensed data.

Poster to be presented at the RDA 19th Plenary Meeting, part of International Data Week, 20–23 June 2022, Seoul, South Korea.

Acknowledgments: The PARSEC project is funded by the Belmont Forum, Collaborative Research Action on Science-Driven e-Infrastructures Innovation. J.M. is grateful for the support from FAPESP (grant 2020/03514-9).
