Published August 28, 2020 | Version v2
Presentation · Open Access

Enabling reproducible ML and Systems research: the good, the bad, and the ugly

  • Grigori Fursin (cTuning foundation, cKnowledge SAS)


Invited talk at FastPath 2020 (International Workshop on Performance Analysis of Machine Learning Systems) co-located with ISPASS 2020.


10 years ago we released our ML-based MILEPOST compiler together with all related code and experimental data. Unfortunately, this research quickly stalled after we struggled to reproduce the performance results and predictive models shared by volunteers across rapidly changing systems.

In this talk, I will describe my 10-year effort to solve numerous reproducibility issues in ML and systems research. I will share my experience reproducing 150+ ML and systems papers during artifact evaluation at ASPLOS, MLSys, CGO, PPoPP and Supercomputing. This tedious experience motivated me to develop the Collective Knowledge framework and the open portal to bring DevOps principles to our research. I will also present cKnowledge solutions: a new way to package and share research artifacts and results with common Python APIs, CLI actions, portable workflows and JSON meta descriptions. Such solutions can be used to automatically build, benchmark and validate ML and systems experiments across continuously evolving platforms.
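The packaging idea above can be illustrated with a minimal sketch. This is not the actual Collective Knowledge API; the function name, meta fields and actions below are hypothetical, chosen only to show the pattern of a JSON meta description combined with a single dict-in/dict-out entry point shared by all artifacts:

```python
import json

# Hypothetical sketch of the CK-style pattern (illustrative, not the real CK API):
# each artifact carries a JSON meta description, and every action goes through
# one common entry point that takes and returns a dictionary.

# Example JSON meta description for a benchmark artifact (made-up fields).
META = json.loads("""
{
  "name": "image-classification-benchmark",
  "tags": ["ml", "benchmark"],
  "deps": {"model": "resnet50", "dataset": "imagenet-subset"}
}
""")

def access(request):
    """Common entry point: dict request in, dict result out.

    A non-zero 'return' code signals an error, so every artifact can be
    driven by the same automation (CLI wrappers, portable workflows).
    """
    action = request.get('action')
    if action == 'show-meta':
        return {'return': 0, 'meta': META}
    if action == 'benchmark':
        # A real workflow would resolve deps, then build and run the program.
        return {'return': 0, 'results': {'status': 'placeholder'}}
    return {'return': 1, 'error': 'unknown action: %s' % action}

# Usage: the caller checks the return code instead of catching exceptions.
r = access({'action': 'show-meta'})
if r['return'] > 0:
    raise RuntimeError(r['error'])
print(r['meta']['name'])
```

Because every artifact answers the same dict-based protocol and describes itself in JSON, tools can discover, compose and rerun experiments across platforms without per-artifact glue code.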

I will conclude with several practical use cases of our technology developed in collaboration with Arm, IBM, General Motors, the Raspberry Pi Foundation and MLPerf. Our long-term goal is to help researchers share their new ML techniques as production-ready packages along with published papers, and to participate in collaborative and reproducible benchmarking, co-design and comparison of efficient ML/software/hardware stacks.





Additional details


  • Grigori Fursin (2020). The Collective Knowledge project: making ML models more portable and reproducible with open APIs, reusable best practices and MLOps. arXiv:2006.07161