Published January 11, 2024 | Version v1
Report | Open Access

Report on FAIR Signposting and its Uptake by the Community

  • 1. Centre for Plant Biotechnology and Genomics
  • 2. Universidad Politécnica de Madrid
  • 3. University of Oxford
  • 4. Data Archiving and Networked Services
  • 5. Novo Nordisk (Denmark)
  • 6. Deutsches Zentrum für Luft- und Raumfahrt e.V.
  • 7. University of Bremen
  • 8. University of Manchester
  • 9. University of Amsterdam
  • 10. Universität Bielefeld, Universitätsbibliothek Bielefeld
  • 11. Nantes Université
  • 12. Inserm

Description

The FAIR Metrics subgroup of the EOSC Task Force on FAIR Metrics and Data Quality is dedicated to scrutinizing the adoption and impact of FAIRness metrics and the practicality of FAIRness evaluations. Previous reports have highlighted the inconsistency in results when the same digital object is evaluated by different tools, attributing these disparities to varied interpretations of the Metrics, metadata publishing practices, and the fundamental objectives of FAIR principles. The authors of this report, representing the FAIR Metrics subgroup, have facilitated six workshops and hackathon-style gatherings, convening diverse FAIR assessment stakeholders—including tool developers, standards and repository experts, and interoperability specialists.

The initial workshops, as outlined in an earlier report endorsed by EOSC, pinpointed a metadata publishing design pattern known as “FAIR Signposting.” This approach offers a transparent, standards-compliant, and straightforward mechanism for guiding automated agents through metadata spaces to locate three essential FAIR elements: the globally unique identifier (GUID), the data records, and the corresponding metadata records. Furthermore, these sessions led to the development of a paradigm for creating reference environments. These environments serve as benchmarks for evaluating the compliance of metadata harvesters with FAIR Signposting criteria and for standardizing the metadata harvesting process of FAIR assessment tools.
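To illustrate the design pattern described above: FAIR Signposting exposes its three elements as typed links (per RFC 8288 Web Linking), using the relations "cite-as" for the GUID, "describedby" for metadata records, and "item" for data records. The sketch below is illustrative only and not taken from the report; the helper functions and the example URLs are hypothetical, showing how an automated agent might parse an HTTP `Link` header from a Signposting-enabled landing page.

```python
import re

def parse_link_header(header: str) -> list[dict]:
    """Split a Link header into {"target": ..., "rel": ..., ...} entries."""
    links = []
    # Each entry looks like: <https://example.org>; rel="cite-as"; type="..."
    for entry in re.split(r',\s*(?=<)', header):
        match = re.match(r'<([^>]*)>\s*(.*)', entry)
        if not match:
            continue
        target, params = match.groups()
        link = {"target": target}
        for name, value in re.findall(r';\s*(\w+)="([^"]*)"', params):
            link[name] = value
        links.append(link)
    return links

def signposting_elements(header: str) -> dict:
    """Group parsed links by the three FAIR Signposting relations."""
    wanted = {"cite-as", "describedby", "item"}
    grouped = {rel: [] for rel in wanted}
    for link in parse_link_header(header):
        rel = link.get("rel")
        if rel in wanted:
            grouped[rel].append(link)
    return grouped

# Hypothetical Link header, as a Signposting-enabled repository might emit it
# (the URLs are illustrative, not from the report):
header = (
    '<https://doi.org/10.5281/zenodo.0000000>; rel="cite-as", '
    '<https://example.org/records/1/export/datacite>; rel="describedby"; '
    'type="application/vnd.datacite.datacite+json", '
    '<https://example.org/records/1/files/report.pdf>; rel="item"; '
    'type="application/pdf"'
)
elements = signposting_elements(header)
```

After parsing, `elements["cite-as"]` holds the persistent identifier to cite, while `elements["describedby"]` and `elements["item"]` point the agent to the metadata and data records, which is exactly the navigation the pattern standardizes for assessment tools.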

This report provides an updated overview of the recent progress achieved by the FAIR Metrics subgroup, encapsulating the advances from the latest two hackathon events. We outline the progress towards developing an integrated suite of FAIR assessment tests, which are now becoming standardized across various FAIR assessment instruments. Additionally, we present preliminary insights into the community's adoption of these practices and report on the ongoing effort to establish a FAIR testing governance body, along with strategies to ensure its enduring viability.

We, the authors, advocate for the EOSC Association and the EOSC Task Force on Long-term Data Preservation to endorse FAIR Signposting, along with the outcomes of our endeavours, as formal recommendations for EOSC-linked resources and EOSC-funded initiatives. We aim to promote an EOSC infrastructure where FAIR-enabling services can be assessed with clarity and uniformity. This will also ensure that tools designed for the EOSC will function with a standardized set of expectations applicable to all stakeholders, providers and users alike.

Files (640.8 kB)

Second Report on FAIR Signposting events.pdf (640.8 kB, md5:265fa17ad288a2b2fc3f6004c35257aa)