Published June 21, 2024 | Version v1
Journal article | Open Access

Localizing 3D motion through the fingertips: Following in the footsteps of elephants

  • 1. The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University
  • 2. World Hearing Centre, Institute of Physiology and Pathology of Hearing

Description

Each sense serves a distinct function in spatial perception, and together they form a joint multisensory spatial representation. For instance, hearing enables localization throughout the entire 3D external space, whereas touch traditionally allows localization only of objects on the body (i.e., within the peripersonal space alone). We use an in-house touch-motion algorithm (TMA) to evaluate individuals’ capability to understand externalized 3D information through touch, a skill acquired neither during an individual’s development nor over the course of evolution. Four experiments demonstrate rapid learning and high accuracy in localizing motion from vibrotactile inputs on the fingertips, as well as successful audio-tactile integration in background noise. Subjective reports from some participants suggest spatial experiences involving visualization and the perception of tactile “moving” sources beyond reach. We discuss our findings with respect to developing new skills in the adult brain, including combining a newly acquired “sense” with an existing one, and computation-based brain organization.

Files

Localizing 3D motion through the fingertips - Following in the footsteps of elephants.pdf

Additional details

Funding

European Commission
GuestXR: A Machine Learning Agent for Social Harmony in eXtended Reality (grant no. 101017884)