Project deliverable Open Access

D4.1 Draft Recommendations on Requirements for Fair Datasets in Certified Repositories

Devaraju, Anusuriya; Herterich, Patricia

Contributors
Fankhauser, Eliane; L'Hours, Herve; Davidson, Joy; von Stein, Ilona; de Vries, Jerry; Diepenbroek, Michael; Ashley, Kevin; Mokrane, Mustapha; Huigen, Frans

The overall goal of FAIRsFAIR is to accelerate the realization of the goals of the European Open Science Cloud (EOSC) by compiling and disseminating all knowledge, expertise, guidelines, implementations, new trajectories, training and education on FAIR matters. FAIRsFAIR work package 4 (WP4) will support the provision of practical solutions for implementing the FAIR principles through the co-development and implementation of certification schemes for trusted data repositories enabling FAIR research data in the EOSC, and the provision of organizational support and outreach activities. 

One of the objectives of WP4 is to develop requirements (e.g., metrics) and tools to pilot the FAIR assessment of digital objects, in particular research data objects in trustworthy digital repositories (TDRs). This report presents the first results of work carried out towards achieving the objective. We outline the context for our activities by summarising related work both performed in other work packages within FAIRsFAIR and approaches from the wider community to address FAIR data assessment. We introduce a range of scenarios for assessing data objects for FAIRness before or after deposit in data repositories and outline two primary use cases that we want to focus on in the project:

  • A trustworthy data repository will offer a manual self-assessment tool to educate researchers and raise their awareness of making their data FAIR before depositing the data into the repository, and

  • A trustworthy data repository committed to FAIR data provision wants to programmatically assess datasets for their level of FAIRness over time. To facilitate this, FAIRsFAIR will develop an automated assessment for published datasets that will be piloted with some of the repositories selected for in-depth collaboration as part of the FAIRsFAIR open calls.

In addition, we present a set of preliminary metrics corresponding to the FAIR principles that can be used to assess data objects through manual and automated testing. We discuss the development and key aspects of the metrics, including their initial alignment with the existing CoreTrustSeal requirements. This alignment forms a basis for developing the FAIR elaboration of the CoreTrustSeal requirements, which is one of the main ongoing activities of WP4. Furthermore, we present draft requirements that any FAIR assessment implementation will need to consider and highlight how those requirements will affect the use cases for FAIR assessment that our upcoming work will address. We conclude by outlining the next steps in our work to iteratively improve the requirements through a number of pilots. Our priorities include refining the suggested metrics, in the context of the use cases developed, based on feedback elicited during pilot testing with several communities.

This is the draft version of the deliverable, not yet approved by the European Commission. Though we cannot undertake to respond to every comment directly, we are seeking wide feedback on this deliverable, which will inform discussions and further work within FAIRsFAIR as well as collaborations with other relevant projects. Comments and suggestions can be added until 01.07.2020 at https://drive.google.com/file/d/1dEKvKVqQGZ7Kvejeo7UYjquWMcTg-GtK/view?usp=sharing