Data from: Shared and modality-specific brain regions that mediate auditory and visual word comprehension
- 1. University of Dundee
- 2. University of Münster
- 3. Bielefeld University
Description
Visual speech carried by lip movements is an integral part of communication. Yet it remains unclear to what extent visual and acoustic speech comprehension are mediated by the same brain regions. Using multivariate classification of full-brain MEG data, we first probed where the brain represents acoustically and visually conveyed word identities. We then tested where these sensory-driven representations are predictive of participants' trial-wise comprehension. The comprehension-relevant representations of auditory and visual speech converged only in anterior angular and inferior frontal regions and were spatially dissociated from the representations that best reflected the sensory-driven word identity. These results provide a neural explanation for the behavioural dissociation of acoustic and visual speech comprehension and suggest that cerebral representations encoding word identities may be more modality-specific than often assumed.
Files
Data_auditory_visual_word_comprehension.zip (25.0 MB)

| MD5 checksum | Size |
|---|---|
| ca9dc149a44e2035a5b5309fd250aca1 | 3.2 MB |
| 0a6cba441e5634a5dca615e7298e9841 | 393.4 kB |
| 08ca725aff8be301488d84a21dc854c8 | 10.4 MB |
| d7fe2d411970b58ed511ee75aa305d58 | 659.8 kB |
| 9b33c158d96d616b4f92a60a43fcc7d9 | 9.7 MB |
| 95482d058e2869ac74fb1643fc2538f0 | 603.3 kB |
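Downloaded files can be checked against the MD5 checksums listed above. A minimal sketch in Python using the standard-library `hashlib` module; the filename and expected checksum shown in the usage comment are taken from this record, but the helper function itself is illustrative, not part of the dataset:

```python
import hashlib


def md5sum(path, chunk_size=1 << 20):
    """Compute the MD5 checksum of a file, reading it in 1 MiB chunks
    so large archives do not need to fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Hypothetical usage, comparing against a checksum from the table above:
# assert md5sum("Data_auditory_visual_word_comprehension.zip") == \
#     "ca9dc149a44e2035a5b5309fd250aca1"
```

On the command line, `md5sum <file>` (GNU coreutils) or `md5 <file>` (macOS) performs the same check.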
Additional details
Related works
- Is supplemented by
- 10.5061/dryad.1qq7050 (DOI)