Journal article Open Access

FAIR Computational Workflows

Carole Goble; Sarah Cohen-Boulakia; Stian Soiland-Reyes; Daniel Garijo; Yolanda Gil; Michael R. Crusoe; Kristian Peters; Daniel Schober

Computational workflows describe the complex multi-step methods that are used for data collection, data preparation, analytics, predictive modelling, and simulation that lead to new data products.

They can inherently contribute to the FAIR data principles: by processing data according to established metadata; by creating metadata themselves during the processing of data; and by tracking and recording data provenance.

These properties aid data quality assessment and contribute to secondary data usage. Moreover, workflows are digital objects in their own right.

This paper argues that FAIR principles for workflows need to address their specific nature in terms of their composition of executable software steps, their provenance, and their development. 
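To make the provenance-tracking idea concrete, here is a minimal illustrative sketch (not taken from the paper; file names and the transformation are hypothetical) of a single workflow step that records machine-readable provenance metadata, including checksums and timestamps for its inputs and outputs, alongside the data product it creates:

```python
# Hypothetical sketch: one workflow step that emits its own provenance record.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def sha256(path: Path) -> str:
    """Content checksum used to identify a data file in the provenance record."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def run_step(step_name: str, input_path: Path, output_path: Path) -> None:
    """Run one workflow step and write a provenance record next to its output."""
    started = datetime.now(timezone.utc).isoformat()

    # The analytical work itself -- here only a placeholder transformation.
    output_path.write_text(input_path.read_text().upper())

    provenance = {
        "step": step_name,
        "started": started,
        "ended": datetime.now(timezone.utc).isoformat(),
        "inputs": [{"path": str(input_path), "sha256": sha256(input_path)}],
        "outputs": [{"path": str(output_path), "sha256": sha256(output_path)}],
    }
    output_path.with_suffix(".prov.json").write_text(json.dumps(provenance, indent=2))


if __name__ == "__main__":
    src = Path("raw_data.txt")
    src.write_text("example observation records\n")
    run_step("normalise-text", src, Path("normalised_data.txt"))
```

In this sketch the provenance record is created by the step itself, so the derived data product carries metadata about its origin that can later support quality assessment and reuse.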

Accepted for the Data Intelligence special issue: FAIR Best Practices, 2019. Carole Goble acknowledges funding by BioExcel2 (H2020 823830), IBISBA1.0 (H2020 730976) and EOSC-Life (H2020 824087). Daniel Schober's work was financed by PhenoMeNal (H2020 654241) during the initiation phase of this effort; his current work is an in-kind contribution. Kristian Peters is funded by the German Network for Bioinformatics Infrastructure (de.NBI) and acknowledges BMBF funding under grant number 031L0107. Stian Soiland-Reyes is funded by BioExcel2 (H2020 823830). Daniel Garijo and Yolanda Gil gratefully acknowledge support from DARPA award W911NF-18-1-0027, NIH award 1R01AG059874-01, and NSF award ICER-1740683.
Files (492.4 kB)
FAIR Computational Workflows-accepted-20190705.pdf (492.4 kB)
md5:7a4b14781f897b2d5a6e512b0304c57d