DMP Evaluation Criteria — focus on project-lifecycle, research discipline and AI-assisted evaluations
Authors/Creators
1. Humboldt-Universität zu Berlin
2. Institute of Molecular Biology
3. Forschungszentrum Jülich
4. Leibniz Institute of Plant Genetics and Crop Plant Research
5. Zuse Institute Berlin
6. Deutsches Zentrum für Luft- und Raumfahrt e. V. (DLR)
7. ZB MED - Information Centre for Life Sciences
8. Deutsches Archäologisches Institut, Abteilung Athen
9. University of Mannheim
10. Technical University of Darmstadt
Description
This preprint presents a practical, lifecycle-based framework for evaluating Data Management Plans (DMPs), developed by the infra-dmp DMP Evaluation Criteria Working Group, operating within the Common Infrastructure Section of the National Research Data Infrastructure (NFDI). The framework addresses a key gap in research data management: the absence of clear, actionable criteria for assessing the quality, completeness, and feasibility of DMPs throughout a research project.
Central to this framework is the recognition of DMPs as living documents that evolve alongside the research they describe. Evaluation criteria are structured around three project lifecycle stages — the proposal/early stage, the mid-project stage, and the end-project stage — ensuring that assessments remain sensitive to the project's current phase and reflect the guiding question: Is the information provided sufficient for the current stage of the project? By integrating funder requirements, FAIR data principles, and discipline-specific standards, the framework enables consistent, transparent, and constructive evaluation across scientific domains.
The preprint further introduces DMP-EVA (Data Management Plan Evaluation), an open-source computational tool that operationalizes the evaluation criteria. DMP-EVA automates the process of checking DMPs against the lifecycle-based framework, offering data stewards, researchers, and proposal reviewers an immediate and more objective quality assessment. This reduces the subjectivity and time burden of manual review while improving alignment with funder expectations and best practices in research data management.
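The stage-aware checking described above can be illustrated with a minimal sketch. The stage names, required-section criteria, and function below are illustrative assumptions for this summary, not DMP-EVA's actual API or criteria catalogue.

```javascript
// Illustrative sketch of lifecycle-stage DMP checking (not DMP-EVA's real API).
// Each later stage requires a superset of the earlier stage's DMP sections.
const STAGE_CRITERIA = {
  proposal: ["dataDescription", "storage"],
  mid: ["dataDescription", "storage", "metadataStandard", "backupPlan"],
  end: ["dataDescription", "storage", "metadataStandard", "backupPlan",
        "repository", "license"],
};

// Report which required sections are missing or empty for the given stage,
// answering the guiding question: is the information provided sufficient
// for the current stage of the project?
function evaluateDmp(dmp, stage) {
  const required = STAGE_CRITERIA[stage] ?? [];
  const missing = required.filter((key) => !dmp[key] || dmp[key].trim() === "");
  return { stage, complete: missing.length === 0, missing };
}

// Example: a DMP that satisfies the proposal stage but not the mid-project stage.
const exampleDmp = {
  dataDescription: "RNA-seq count matrices, CSV",
  storage: "institutional NAS",
};
console.log(evaluateDmp(exampleDmp, "proposal")); // complete: true
console.log(evaluateDmp(exampleDmp, "mid"));      // missing: metadataStandard, backupPlan
```

Keeping the criteria as data rather than code is what makes discipline-specific adaptations straightforward: a community can swap in its own required sections per stage without touching the evaluation logic.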
Together, the evaluation framework and its software implementation support improvements in DMP quality, reproducibility, and the long-term reusability of research data. The work is intended both as a practical resource for RDM professionals and as a foundation for community-driven development, including discipline-specific adaptations and adjustments to diverse funding contexts.
Files (1.0 MB)

| Name | MD5 checksum | Size |
|---|---|---|
| DMP_Evaluation_V1.0.pdf | md5:178be54a5895dd7d10a219b433ea5f06 | 676.4 kB |
|  | md5:4dbfe93b02edeb37f101c3e1b67e4fbe | 326.5 kB |
|  | md5:4b9c791b8685b3a085280c4193198816 | 15.4 kB |
Additional details
Dates
- Created: 2026-04-17
Software
- Repository URL: https://github.com/usadellab/dmp-eva
- Programming language: JavaScript
- Development status: Active