Published May 20, 2022 | Version v1
Dataset Open

Data from: Area 2 of primary somatosensory cortex encodes kinematics of the whole arm

  • 1. University of Pittsburgh
  • 2. Columbia University
  • 3. Northwestern University

Description

Proprioception, the sense of body position, movement, and associated forces, remains poorly understood, despite its critical role in movement. Most studies of area 2, a proprioceptive area of somatosensory cortex, have simply compared neurons' activities to the movement of the hand through space. Using motion tracking, we sought to elaborate this relationship by characterizing how area 2 activity relates to whole-arm movements. We found that a whole-arm model, unlike classic models, successfully predicted how features of neural activity changed as monkeys reached to targets in two workspaces. However, when we then evaluated this whole-arm model across active and passive movements, we found that many neurons did not consistently represent the whole arm over both conditions. These results suggest that 1) neural activity in area 2 includes a representation of the whole arm during reaching, and 2) many of these neurons represent limb state differently during active and passive movements.

Notes

This dataset includes behavioral recordings and extracellular neural recordings from area 2 of primary somatosensory cortex of Rhesus macaques during two separate reaching experiments. Raeed Chowdhury collected and processed the data in the laboratory of Lee Miller for use in Chowdhury et al. 2019 (accepted in eLife as of 12/2019), which characterized how area 2 neurons represent reaching movements. Results and methodology from these experiments are described in [1].

In both experiments, monkeys controlled a cursor on a screen using a two-link, planar manipulandum. In the first experiment, from which we include eight total sessions, monkeys reached to sequential, visually presented targets in one of two workspaces: one near the body on the side contralateral to the reaching arm, and one far from the body on the ipsilateral side. In the second experiment, from which we include four total sessions, monkeys performed a simple center-out task in which, on some random trials during the center-hold period, the manipulandum applied a perturbation to the monkey's hand. During these reaching tasks, we tracked the locations of ten markers on the monkey's arm, which we used to estimate joint angles and muscle lengths during the behavioral experiments. In addition to the behavioral data, we collected neural data from area 2 using Blackrock Utah multielectrode arrays, yielding ~100 channels of extracellular recordings per monkey. Recordings from these channels were thresholded online to detect spikes, which were sorted offline into putative single units.
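To illustrate the online thresholding step described above, here is a minimal sketch of threshold-crossing spike detection on a voltage trace. This is illustrative only: the actual detection ran on Blackrock hardware, and the function name, parameters, and dead-time value here are assumptions, not the dataset's pipeline.

```python
import numpy as np

def detect_threshold_crossings(voltage, fs, threshold, dead_time_s=0.001):
    """Return spike times (s) where the signal crosses below a negative threshold.

    Illustrative sketch only; the dataset's spikes were detected online on
    Blackrock hardware and sorted offline into putative single units.
    """
    below = voltage < threshold
    # A rising edge of the boolean mask marks a new downward crossing
    crossings = np.flatnonzero(below[1:] & ~below[:-1]) + 1
    # Enforce a dead time so a single spike is not counted twice
    dead = int(dead_time_s * fs)
    kept, last = [], -dead
    for c in crossings:
        if c - last >= dead:
            kept.append(c)
            last = c
    return np.asarray(kept, dtype=float) / fs
```

In practice, thresholds are often set relative to the noise level of each channel (e.g., a multiple of the RMS voltage), and the detected snippets, not just the times, are saved for offline sorting.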

In addition to the data from these experiments, we have also included data from several sensory mapping sessions with the three monkeys, where we characterized the sensory receptive fields of several electrodes on the arrays.

Analysis code used to produce figures for [1] provides useful examples for how to work with this dataset. See https://github.com/raeedcho/s1-kinematics.git for code and readme.

If you publish any work using the data, please cite the publication above ([1] Chowdhury et al., 2019) and also cite this dataset.

Funding provided by: National Science Foundation
Crossref Funder Registry ID: http://dx.doi.org/10.13039/100000001
Award Number: DGE-1324585

Funding provided by: National Institute of Neurological Disorders and Stroke
Crossref Funder Registry ID: http://dx.doi.org/10.13039/100000065
Award Number: R01 NS095251

Files

s1-kinematics.zip (4.0 GB)
md5:d414fb51f3aac6c0b44d6f3b639a92c6
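After downloading, the archive's integrity can be checked against the MD5 checksum listed above. A minimal sketch (the file path is whatever you saved the download as):

```python
import hashlib

def md5_of_file(path, chunk_size=1 << 20):
    """Compute the MD5 hex digest of a file, reading in chunks to bound memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the checksum in the file listing:
# md5_of_file("s1-kinematics.zip") == "d414fb51f3aac6c0b44d6f3b639a92c6"
```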

Additional details

Related works

Is cited by
10.7554/eLife.48198 (DOI)