Published August 15, 2019 | Version 1.0.0
Dataset | Open Access

Dataset: Feedback contribution to surface motion perception in the human early visual cortex

  • 1. Maastricht University
  • 2. University Health Network

Description

Dataset

Dataset accompanying the manuscript "Feedback contribution to surface motion perception in the human early visual cortex" (bioRxiv).

Description

The fMRI data are arranged by subject (following the BIDS convention). For each subject, there are subfolders for anatomical and functional MRI data.

├── sub-01
│   ├── anat
│   │   └── ...
│   ├── func
│   │   └── ...
│   ├── func_se
│   │   └── ...
│   └── func_se_op
│       └── ...

The subfolder 'anat' contains four images from the MP2RAGE sequence (among these, T1- and proton-density-weighted images). The subfolder 'func' contains the functional data (GE EPI, T2*-weighted) from the main experiment (i.e. the data from which the haemodynamic response was estimated, and on which the statistical analysis was performed). The subfolders 'func_se' and 'func_se_op' contain SE EPI images with opposite phase-encode polarity that were used for distortion correction. Moreover, for each image/timeseries there is a JSON file with metadata.
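
As a rough illustration of how opposite-polarity SE EPI images are typically used for susceptibility distortion correction with FSL's topup, see the sketch below. The file names, the acquisition parameter file, and the applytopup options are placeholders/assumptions; the exact commands used for the published analysis are defined in the analysis scripts on GitHub.

    # Merge the spin-echo images with opposite phase-encode polarity
    # (file names are placeholders).
    fslmerge -t se_merged sub-01/func_se/se.nii.gz sub-01/func_se_op/se_op.nii.gz

    # Estimate the susceptibility-induced off-resonance field; acqparams.txt
    # encodes the phase-encode direction and readout time of each volume.
    topup --imain=se_merged --datain=acqparams.txt --config=b02b0.cnf \
          --out=topup_results

    # Apply the estimated field to a functional time series.
    applytopup --imain=sub-01/func/func_run01.nii.gz --inindex=1 \
               --datain=acqparams.txt --topup=topup_results \
               --method=jac --out=func_run01_undistorted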

Anatomical images have been masked anteriorly (defaced). Functional images are in coronal oblique orientation, covering early visual cortex.

The folder 'stimuli' contains information on the stimuli used for retinotopic mapping, including the timecourse models used for population receptive field mapping. (These files are included here because of their relatively large size, which would make distribution via a git repository impractical.) The software used for the presentation of the retinotopic mapping stimuli (and for the corresponding analysis) is available on GitHub.

For example videos of the main experimental stimuli, see Zenodo record 10.5281/zenodo.2583017. If you would like to reproduce the experimental stimuli, the respective PsychoPy code can be found on GitHub.

The exact timing of events during the experiments (rest & stimulus blocks, target events) can be found in FSL-style design matrices ("3 column format") on github.com/ingo-m/PacMan/tree/master/analysis/FSL_MRI_Metadata.
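
For reference, an FSL "3 column format" file contains one row per event, with event onset (in seconds), duration (in seconds), and weight. Below is a minimal, made-up example; the values are purely illustrative and do not correspond to the actual experimental timing.

    # Write a hypothetical 3-column EV file: onset [s], duration [s], weight.
    # The values below are invented for illustration only.
    printf '22.0\t13.0\t1.0\n66.0\t13.0\t1.0\n110.0\t13.0\t1.0\n' \
        > EV_stimulus_example.txt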

Analysis

The analysis pipeline makes use of several MRI software packages (such as SPM and FSL for preprocessing, and CBS Tools for cortical depth sampling). In order to facilitate reproducibility, the entire analysis was containerised using Docker. Because of licensing issues, the Docker images containing the third-party software cannot be made available directly. However, the Dockerfiles and detailed instructions for creating the Docker images are available on GitHub.

If you would like to reproduce the analysis, the first step is to create the Docker images (which provide an exact copy of the system environment that was used to conduct the published analysis). There are two Docker images, one for the main analysis (motion correction, distortion correction, GLM fitting; named "dockerimage_pacman_jessie") and one for the depth sampling (named "dockerimage_cbs"). Detailed instructions on how to create the Docker images can be found on GitHub (see 'Further resources' below).
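
As a minimal sketch (the Dockerfile locations are assumptions; please follow the linked instructions for the exact procedure), building the two images amounts to something like:

    # Build the image for the main analysis and the image for depth sampling
    # from their respective Dockerfiles (paths are placeholders).
    docker build -t dockerimage_pacman_jessie /path/to/dockerfile_main_analysis/
    docker build -t dockerimage_cbs /path/to/dockerfile_depth_sampling/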

Once you have set up the Docker images, the analysis can be run automatically. For each subject, there is one parent script for the main analysis (e.g. ~/analysis/20180118/metascript_01.sh for subject 20180118) and a separate script for the depth sampling (e.g. ~/analysis/20180118/metascript_03.sh). The only manual adjustment you should have to make in order to reproduce the analysis is to change the file paths in the first section of these scripts ('pacman_anly_path' is the parent directory containing the analysis code, i.e. the git repository, and 'pacman_data_path' is the parent directory containing the MRI data). The main analysis (metascript_01.sh) should take about 24 h per subject on a workstation with 12 cores, and the depth sampling about 2 h. The analysis can be run on consumer-grade hardware, but some parts may not run with less than 16 GB of RAM (recommended: 32 GB).
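
For orientation, a per-subject run could look roughly like the sketch below. The mount points, host paths, and the exact way the metascripts are invoked inside the container are assumptions; 'pacman_anly_path' and 'pacman_data_path' in the metascripts need to point to the mounted directories.

    # Make the analysis code (git repository) and the MRI data available inside
    # the container, then launch the per-subject parent script. Host paths and
    # container mount points are placeholders.
    docker run -it --rm \
        -v /host/path/to/PacMan:/pacman_anly \
        -v /host/path/to/mri_data:/pacman_data \
        dockerimage_pacman_jessie \
        bash /pacman_anly/analysis/20180118/metascript_01.sh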

Visualisations (e.g. cortical depth profiles and signal timecourses) and group-level statistical tests are implemented in py_depthsampling.

Further resources

Please refer to the research paper for more details: https://doi.org/10.1101/653626

The analysis pipeline can be found on https://github.com/ingo-m/PacMan

A separate repository contains the code used for visualisation of depth-sampling results: https://github.com/ingo-m/py_depthsampling/tree/PacMan

Free & open source software package for population receptive field mapping: https://github.com/ingo-m/pyprf


Files

Total size: 28.4 GB

Name          Size        MD5
stimuli.zip   374.8 MB    955dc20b9900d370e69af81d5fa6b78b
              3.9 GB      d35f5645c23293750c737e986667e4cd
              3.3 GB      8870105ea0225ed13cdefe320ad1446b
              3.2 GB      a44e9b5e3c170d54e71bea15c299b551
              3.3 GB      acca86c1309fb0acfc9fbf807f087a31
              3.4 GB      5b0e7f5169d49e00cb6efd5be1155da5
              3.6 GB      fd1772cff11510c29b9e95e2fc2a4338
              3.6 GB      9a8a662b60fdaff9473ab8630993307f
              3.7 GB      c161f3c38e74339d367dc47071d7f322

Additional details

Related works

Is supplement to: 10.1101/653626 (DOI)
Is supplemented by: 10.5281/zenodo.2583017 (DOI)