Dataset · Open Access

Self-Motion Perception in the Elderly

Lich, Matthias; Bremmer, Frank


DataCite XML Export
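
The XML shown below can also be retrieved programmatically. A minimal sketch in Python, assuming Zenodo serves DataCite 4 exports at an /export/dcite4 path for this record (the export path is an assumption, not part of this record):

# Sketch: fetch and parse the DataCite XML export for Zenodo record 824677.
# The /export/dcite4 path is an assumed Zenodo export endpoint.
import urllib.request
import xml.etree.ElementTree as ET

RECORD_URL = "https://zenodo.org/record/824677"     # identifier from the record below
EXPORT_URL = RECORD_URL + "/export/dcite4"          # assumed export path
NS = {"dc": "http://datacite.org/schema/kernel-4"}  # DataCite kernel-4 namespace

with urllib.request.urlopen(EXPORT_URL) as resp:
    root = ET.fromstring(resp.read())

title = root.findtext("dc:titles/dc:title", namespaces=NS)
year = root.findtext("dc:publicationYear", namespaces=NS)
creators = [c.findtext("dc:creatorName", namespaces=NS)
            for c in root.findall("dc:creators/dc:creator", NS)]
print(f"{'; '.join(creators)} ({year}). {title}. Zenodo.")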

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="URL">https://zenodo.org/record/824677</identifier>
  <creators>
    <creator>
      <creatorName>Lich, Matthias</creatorName>
      <givenName>Matthias</givenName>
      <familyName>Lich</familyName>
      <affiliation>Dept. Neurophysics, Philipps-Universität Marburg</affiliation>
    </creator>
    <creator>
      <creatorName>Bremmer, Frank</creatorName>
      <givenName>Frank</givenName>
      <familyName>Bremmer</familyName>
      <affiliation>Dept. Neurophysics, Philipps-Universität Marburg</affiliation>
    </creator>
  </creators>
  <titles>
    <title>Self-Motion Perception in the Elderly</title>
  </titles>
  <publisher>Zenodo</publisher>
  <publicationYear>2014</publicationYear>
  <subjects>
    <subject>heading</subject>
    <subject>ageing</subject>
    <subject>neural network model</subject>
    <subject>area MST</subject>
    <subject>virtual reality</subject>
  </subjects>
  <dates>
    <date dateType="Issued">2014-09-15</date>
  </dates>
  <resourceType resourceTypeGeneral="Dataset"/>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://zenodo.org/record/824677</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsIdenticalTo">10.3389/fnhum.2014.00681</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="https://creativecommons.org/licenses/by/4.0/legalcode">Creative Commons Attribution 4.0 International</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;Self-motion through space generates a visual pattern called optic flow. It can be used to determine one’s direction of self-motion (heading). Previous studies have already shown that this perceptual ability, which is of critical importance during everyday life, changes with age. In most of these studies subjects were asked to judge whether they appeared to be heading to the left or right of a target. Thresholds were found to increase continuously with age. In our current study, we were interested in absolute rather than relative heading judgements and in the question about a potential neural correlate of an age-related deterioration of heading perception. Two groups, older test subjects and younger controls, were shown optic flow stimuli in a virtual-reality setup. Visual stimuli simulated self-motion through a 3-D cloud of dots and subjects had to indicate their perceived heading direction after each trial. In different subsets of experiments we varied individually relevant stimulus parameters: presentation time, number of dots in the display, stereoscopic vs. non-stereoscopic stimulation, and motion coherence. We found decrements in heading performance with age for each stimulus parameter. In a final step we aimed to determine a putative neural basis of this behavioural decline. To this end we modified a neural network model which previously has proven to be capable of reproduce and predict certain aspects of heading perception. We show that the observed data can be modeled by implementing an age related neuronal cell loss in this neural network. We conclude that a continuous decline of certain aspects of motion perception, among them heading, might directly be based on an age-related progressive loss of groups of neurons being activated by visual motion. &lt;/p&gt;</description>
    <description descriptionType="Other">This work was supported by EU (EUROKINESIS and MEMORY) and Deutsche Forschungsgemeinschaft (SFB/TRR-135/A2).</description>
  </descriptions>
</resource>
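
The abstract's modelling claim, that an age-related loss of motion-sensitive neurons can reproduce the behavioural decline, can be illustrated with a toy population code. The sketch below is not the authors' model; it is a hypothetical Python illustration in which heading is decoded from a bank of cosine-tuned, MST-like units, and "ageing" is simulated by silencing a random fraction of the units, which increases the mean heading error.

# Hypothetical illustration (not the authors' model): a population-vector
# heading readout degrades as a fraction of tuned units is silenced.
import numpy as np

rng = np.random.default_rng(0)
n_units = 200
preferred = np.linspace(-np.pi, np.pi, n_units, endpoint=False)  # preferred headings

def mean_heading_error(cell_loss, n_trials=500, noise=0.3):
    """Mean absolute heading error (degrees) after silencing a fraction of units."""
    alive = rng.random(n_units) >= cell_loss  # surviving units for this "age"
    errors = []
    for _ in range(n_trials):
        true_heading = rng.uniform(-np.pi, np.pi)
        # Cosine-tuned, rectified responses plus noise; silenced units stay at zero.
        rate = np.cos(preferred - true_heading) + noise * rng.standard_normal(n_units)
        rate = np.clip(rate, 0.0, None) * alive
        # Population-vector decode of the heading direction.
        est = np.arctan2((rate * np.sin(preferred)).sum(),
                         (rate * np.cos(preferred)).sum())
        err = np.angle(np.exp(1j * (est - true_heading)))  # wrap to [-pi, pi]
        errors.append(abs(err))
    return np.degrees(np.mean(errors))

for loss in (0.0, 0.3, 0.6):
    print(f"cell loss {loss:.0%}: mean heading error {mean_heading_error(loss):.1f} deg")

Running this toy readout shows the error growing with simulated cell loss, mirroring the qualitative direction of the behavioural effect described in the abstract.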
Views: 73 (72 unique)
Downloads: 2 (2 unique)
Data volume: 9.4 kB
