Published August 20, 2021 | Version v1
Journal article | Open Access

Different mechanisms of magnitude and spatial representation for tactile and auditory modalities

Description

The human brain creates a representation of the external world based on magnitude judgments, estimating distance, numerosity, or size. Magnitude and spatial representations are hypothesized to rely on common mechanisms shared across sensory modalities. We explored the relationship between magnitude and spatial representation using two different sensory systems. We hypothesized that space and magnitude interact differently depending on the sensory modality. Furthermore, we aimed to understand the role of the spatial reference frame in magnitude representation. We used stimulus–response compatibility (SRC) to investigate these processes, which assumes that performance improves when stimulus and response share common features. We designed auditory and tactile SRC tasks with conflicting spatial and magnitude mappings. Our results showed that sensory modality modulates the relationship between space and magnitude. In the tactile task, magnitude congruency had a larger effect than spatial congruency. In the auditory task, however, magnitude and space carried similar weight, with neither spatial nor magnitude congruency having a significant effect. Moreover, we observed that the spatial reference frame adopted during the tasks depended on the sensory input. Participants' performance in the tactile task reversed between uncrossed and crossed hand postures, suggesting an internal coordinate system. In contrast, crossing the hands did not alter performance in the auditory task, consistent with an allocentric frame of reference. Overall, these results suggest that the interaction between space and magnitude differs between the auditory and tactile modalities, supporting the idea that these sensory modalities use different mechanisms for magnitude and spatial representation.

Files (1.4 MB)

Bollini2021_EXBR_DifferentMechanismsOfMagnitude.pdf