Published September 2, 2020 | Version v1
Presentation · Open Access

FAIR Computational Workflows

  • Université Paris-Sud
  • Department of Computer Science, The University of Manchester
  • Information Sciences Institute, University of Southern California
  • Common Workflow Language project
  • Leibniz Institute of Plant Biochemistry (IPB Halle), Department of Biochemistry of Plant Interactions

Description

Computational workflows describe the complex multi-step methods that are used for data collection, data preparation, analytics, predictive modelling, and simulation that lead to new data products. They can inherently contribute to the FAIR data principles: by processing data according to established metadata; by creating metadata themselves during the processing of data; and by tracking and recording data provenance. These properties aid data quality assessment and contribute to secondary data usage. Moreover, workflows are digital objects in their own right.
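To make the provenance-tracking idea concrete, here is a minimal Python sketch (not taken from the paper) of a workflow runner that records a provenance entry for each executed step. The step names, record fields, and checksum choice are illustrative assumptions, not part of the described work.

```python
import hashlib
import json
from datetime import datetime, timezone

def run_step(name, func, data, provenance):
    """Execute one workflow step and append a provenance record
    linking the step's input and output by checksum."""
    result = func(data)
    provenance.append({
        "step": name,  # illustrative step identifier
        "executed_at": datetime.now(timezone.utc).isoformat(),
        "input_md5": hashlib.md5(json.dumps(data).encode()).hexdigest(),
        "output_md5": hashlib.md5(json.dumps(result).encode()).hexdigest(),
    })
    return result

# A two-step workflow: data preparation, then a simple analysis.
provenance = []
raw = [1.0, None, 3.0]
cleaned = run_step("prepare", lambda xs: [x for x in xs if x is not None],
                   raw, provenance)
mean = run_step("analyse", lambda xs: sum(xs) / len(xs),
                cleaned, provenance)
```

After running, `provenance` holds one record per step, so the chain from raw input to final result can be audited, which is the kind of property that supports data quality assessment and secondary use.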

This is a presentation of the paper FAIR Computational Workflows, published in Data Intelligence. The paper argues that FAIR principles for workflows need to address their specific nature in terms of their composition of executable software steps, their provenance, and their development.

Presented at ECCB 2020 Workshop on FAIR Computational Workflows.

Files (1.9 MB)

FAIRWorkflows.pdf (1.9 MB, md5:cd2fce30b97b050968127cc4f87a008b)

Additional details

Related works

Cites
Journal article: 10.1038/sdata.2016.18 (DOI)
Is supplement to
Journal article: 10.1162/dint_a_00033 (DOI)