Published November 2, 2020 | Version v1
Dataset Open

Entorhinal-retrosplenial circuits for allocentric-egocentric transformation of boundary coding

  • 1. Max Planck Institute for Brain Research
  • 2. Goethe University Frankfurt

Description

Spatial navigation requires landmark coding from two perspectives, relying on both viewpoint-invariant and self-referenced representations. The brain encodes information within each reference frame, but their interactions and functional dependencies remain unclear. Here we investigate the relationship between neurons in the rat retrosplenial cortex (RSC) and medial entorhinal cortex (MEC) that increase their firing near the boundaries of space. Border cells in RSC specifically encode walls, but not objects, and are sensitive to the animal's direction to nearby borders. These egocentric representations are generated independently of visual or whisker sensation but depend on inputs from MEC, which contains allocentric spatial cells. Pharmaco- and optogenetic inhibition of MEC cells disrupted border coding in RSC, but not vice versa, indicating an allocentric-to-egocentric transformation. Finally, RSC border cells fire prospectively, anticipating the animal's next movement, unlike those in MEC, revealing the MEC-RSC pathway as an extended border-coding circuit that implements a coordinate transformation to guide navigation behavior.

Notes

We refer the reader to the associated paper for details on behavioural data acquisition, as well as a detailed account of how the data vectors were generated. The data files include the following variables for all individual neurons:

  • Animal ID: animal identifier that corresponds with those mentioned in figure legends of the paper.
  • Session ID: session identifier that corresponds to the recording day number.
  • EMD scores: Earth Mover's Distance (EMD), the main metric for border cell classification.
  • Global FR: Overall spiking rate in spikes/s, calculated as the total number of spikes divided by the duration of the recording session.
  • Spatial rate map: Firing rate map in 2D space.
  • Boundary rate maps: Firing rate map in egocentric border space, with bins corresponding to the angle (3rd dimension) and distance (4th dimension) of a boundary relative to the animal's position.
  • Spatial correlations: Pearson's correlation between the spatial rate maps of the first and last regular recording session on a given day.
  • Allocentric MVL: Mean Vector Length (MVL) values in allocentric head-direction space.
  • Allocentric MVL stat: MVL statistic, corresponding to the percentile of the MVL relative to a 1000-fold time-shuffled null-distribution.
  • Egocentric MVL: MVL values of angles in egocentric boundary space.
  • Egocentric MVL stat: MVL statistic, corresponding to the percentile of the MVL relative to a 1000-fold time-shuffled null-distribution.
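As a concrete illustration of the MVL statistics listed above, the sketch below computes the Mean Vector Length of a set of angles and its percentile relative to a time-shuffled null distribution, in the spirit of the 1000-fold shuffle described for the Allocentric/Egocentric MVL stats. This is a minimal, hypothetical reconstruction (the function names, the circular time-shift scheme, and the `angle_fn` interface are our assumptions, not the paper's actual pipeline); consult the associated paper for the exact procedure.

```python
import numpy as np

def mean_vector_length(angles):
    """Mean Vector Length (MVL) of angles in radians: the magnitude
    of the average unit vector. 1 = perfectly concentrated, 0 = uniform."""
    return np.abs(np.mean(np.exp(1j * np.asarray(angles))))

def mvl_shuffle_percentile(spike_times, angle_fn, session_duration,
                           n_shuffles=1000, rng=None):
    """Observed MVL and its percentile against a time-shuffled null.

    Each shuffle circularly shifts the spike train by a random offset
    within the session, then re-samples the tracked angle (e.g. head
    direction, or egocentric boundary angle) at the shifted times.
    `angle_fn(times) -> angles` is a hypothetical interface standing in
    for a lookup into the behavioural tracking data.
    """
    rng = np.random.default_rng(rng)
    observed = mean_vector_length(angle_fn(spike_times))
    null = np.empty(n_shuffles)
    for i in range(n_shuffles):
        shift = rng.uniform(0.0, session_duration)
        shifted = (np.asarray(spike_times) + shift) % session_duration
        null[i] = mean_vector_length(angle_fn(shifted))
    # Percentile of the observed MVL within the null distribution (0-100).
    return observed, 100.0 * np.mean(null < observed)
```

A cell whose observed MVL exceeds, say, the 99th percentile of the shuffled distribution would be considered significantly tuned in that angular space; circular shifting preserves the spike train's temporal structure while breaking its alignment to behaviour.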

Funding provided by: Japan Science and Technology Agency
Crossref Funder Registry ID: http://dx.doi.org/10.13039/501100002241
Award Number: JPMJPR1682

Funding provided by: H2020 European Research Council
Crossref Funder Registry ID: http://dx.doi.org/10.13039/100010663
Award Number: 714642

Funding provided by: Max-Planck-Gesellschaft
Crossref Funder Registry ID: http://dx.doi.org/10.13039/501100004189

Funding provided by: Behrens-Weise-Foundation
Crossref Funder Registry ID:

Files

Files (125.2 MB)

md5:cfeb5dc3d8b1cb77efb12da34df0a888 (125.2 MB)