Dataset Open Access

Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa

Michael Vesker; Daniela Bahn; Christina Kauschke; Monika Tschense; Franziska Degé; Gudrun Schwarzer


In order to assess how the perception of audible speech and facial expressions influence one another in the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-year-olds), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar categorization task, but with faces acting as visual primes and emotion words acting as auditory targets. The results of Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words versus positive words, and that 6-year-old children were particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Meanwhile, the results of Experiment 2 did not show any congruency effects for priming by facial expressions. Thus, audible emotion words seem to exert an influence on the emotional categorization of faces, while faces do not seem to influence the categorization of emotion words in a significant way.

Raw dataset for the paper (same title and authors as the dataset) published in Frontiers in Psychology.
Files (1.1 MB)

Name                                     Size
exp 1 data word prime face target.xlsx   607.3 kB
exp 2 data face prime word target.xlsx   513.7 kB
                    All versions   This version
Views               76             76
Downloads           32             32
Data volume         18.5 MB        18.5 MB
Unique views        74             74
Unique downloads    21             21