Published December 28, 2022 | Version v1
Dataset (Open Access)

Representations of color and form in mouse visual cortex

  • The University of Texas at Austin

Description

Spatial transitions in color can aid any visual perception task, and their neural representation – an "integration of color and form" – is thought to begin in primary visual cortex (V1). Color and form integration is untested in mouse V1, yet studies show that the ventral retina provides the necessary substrate of green-sensitive rods and UV-sensitive cones. Here, we used two-photon imaging in V1 to measure spatial frequency (SF) tuning along four axes of rod and cone contrast space, including luminance and color. We first reveal that V1 is similarly responsive to luminance and color, yet average SF tuning is significantly shifted toward lowpass for color. Next, guided by linear models, we used SF tuning along all four color axes to estimate the proportion of neurons that fall into classic models of color opponency – "single-", "double-", and "non-opponent". Few neurons (~6%) fit the criteria for double-opponency, the class uniquely tuned to chromatic borders. Most of the population can be described as a unimodal distribution ranging from strongly single-opponent to non-opponent. Consistent with recent studies of the rodent and primate retina, our V1 data are well described by a simple model in which ON and OFF channels to V1 sample the photoreceptor mosaic randomly.
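
The random-pooling account in the last sentence can be sketched in a few lines of MATLAB. The snippet below is purely illustrative and is not the archived simulation code (see Figure_1.m and related scripts in the upload for the actual model); the weighting scheme, the opponency index, and all variable names are assumptions made for the illustration. Each model neuron draws its ON and OFF weights over rods and cones at random, which yields a continuum of chromatic signatures rather than discrete opponent classes.

    % Illustrative sketch only (not the archived simulation code).
    % Each model neuron pools rods and cones at random for its ON and OFF channels.
    nNeurons = 5000;
    wOn  = rand(nNeurons, 2);            % columns: [rod, cone] weights, ON channel
    wOff = rand(nNeurons, 2);            % columns: [rod, cone] weights, OFF channel
    w    = wOn - wOff;                   % net signed weight onto each photoreceptor type
    lumDrive   = abs(w(:,1) + w(:,2));   % drive along the luminance (rod+cone) axis
    colorDrive = abs(w(:,1) - w(:,2));   % drive along the color (rod-cone) axis
    oppIndex = (colorDrive - lumDrive) ./ (colorDrive + lumDrive);  % -1 (luminance-preferring) to +1 (color-opponent)
    histogram(oppIndex, 40);             % a continuum of chromatic signatures, not discrete classes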

Notes

See the uploaded README files for details. Below is the top of README_for_dataset.doc.

This describes the uploaded data set used in Rhim and Nauhaus: "Joint representations of color and form in mouse visual cortex described by random pooling from rods and cones". It is a MATLAB .mat file in which each structure pertains to a given figure. In addition to the source data for the figures, it includes the following (a brief loading sketch follows the list):

  1. The same data set, but prior to culling the population according to the dashed box in the Figure 2 scatter plot. See variables appended with "…_all"

  2. Region-of-interest ID associated with each neuron.
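
As a quick orientation, the data can be loaded and browsed in MATLAB along the lines below. The file and field names are placeholders; the actual names are given in README_for_dataset.doc and in the upload itself.

    % Placeholder file name: substitute the actual .mat file from the upload.
    S = load('colorform_dataset.mat');   % one structure per figure
    disp(fieldnames(S));                 % list the figure-specific structures
    % Variables ending in "_all" hold the full population, i.e. before the
    % culling defined by the dashed box in the Figure 2 scatter plot.
    % The region-of-interest ID of each neuron is stored alongside the tuning data.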

Below is all the information in README_for_simulations.doc.

To run the simulations for Figures 1, 6, 7, and 8, execute the cells in the following high-level scripts: Figure_1.m, Figure_6_7.m, and Figure_8.m. Make sure all of the other .m files are on your MATLAB path.
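
A minimal way to set this up, assuming the upload has been extracted into a single folder (the path below is a placeholder):

    % Make all helper .m files visible, then open the high-level scripts and
    % run them cell by cell (Run Section / Ctrl+Enter in the MATLAB editor).
    addpath(genpath('path/to/unzipped_upload'));   % placeholder path to the extracted upload
    edit Figure_1.m
    edit Figure_6_7.m
    edit Figure_8.m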

Funding provided by: National Institutes of Health
Crossref Funder Registry ID: http://dx.doi.org/10.13039/100000002
Award Number: R01EY028657

Funding provided by: Whitehall Foundation
Crossref Funder Registry ID: http://dx.doi.org/10.13039/100001391
Award Number:

Files

Files (148.8 kB)

  • 16.6 kB (md5:fc90de115c4b12859e84b339c24cc62c)
  • 132.2 kB (md5:5f88317055cc98d21e0abfb72d6c9604)

Additional details

Related works

Is cited by: 10.1101/2021.07.26.453648 (DOI)
Is derived from: 10.5281/zenodo.7489604 (DOI)