Published November 3, 2025 | Version v1
Conference paper (Open Access)

Feature-Based Modelling of Perceived Emotion in Film Music

Description

This study investigates how musical features in film scores relate to perceived emotional expression over time. Film music differs from general music in that it is designed to shape narrative, tension, and audience perception, providing context-specific emotional cues beyond melody or harmony. The study is based on the FME-24 dataset, a collection of 300 professionally composed film score excerpts (2002–2024) covering both contemporary and traditional practices, from which perceptual, rhythmic, and tonal features were extracted and analysed. Participants marked moments of perceived emotional change, described the emotion, and placed it in a valence–arousal space. Preliminary results showed that emotions were clearly perceived but often difficult to verbalize. Analysis revealed weak but significant correlations for certain features (e.g. between zero-crossing rate (ZCR) and arousal), while chord types influenced arousal more strongly. Rhythmic and tonal features showed varied relationships with both dimensions, and arousal was generally perceived more consistently than valence. Isolating the audio enables a more precise mapping between musical features and perceived emotion, establishing a baseline for future audiovisual comparisons. Variability in participant reports highlights the subjectivity of film music perception and supports further feature-based modelling of emotional dynamics in cinematic scoring.
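The kind of feature-emotion analysis described above can be illustrated with a minimal sketch: compute the zero-crossing rate (ZCR) of each audio excerpt and correlate it with that excerpt's mean arousal rating. This is not the authors' pipeline; all signals and ratings below are synthetic placeholders constructed so that noisier signals receive higher arousal, purely to show the shape of the computation.

```python
import numpy as np
from scipy.stats import pearsonr


def zero_crossing_rate(signal: np.ndarray) -> float:
    """Fraction of consecutive sample pairs whose sign differs."""
    signs = np.sign(signal)
    return float(np.mean(signs[:-1] != signs[1:]))


rng = np.random.default_rng(0)

# Synthetic stand-ins for 30 excerpts: increasing noise raises the ZCR,
# and the (hypothetical) arousal rating is tied to the same noise level.
zcr_values, arousal = [], []
for i in range(30):
    noise_level = i / 30
    sig = np.sin(np.linspace(0, 100, 2048)) + noise_level * rng.standard_normal(2048)
    zcr_values.append(zero_crossing_rate(sig))
    arousal.append(noise_level + 0.1 * rng.standard_normal())

# Pearson correlation between the audio feature and the perceived rating.
r, p = pearsonr(zcr_values, arousal)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```

On real data, per-excerpt features would be extracted from the FME-24 audio and paired with participants' valence and arousal annotations in the same way.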

Files

CMMR2025_P1_2.pdf (896.6 kB, md5:0e3f889b05f525f664fc027d5e192680)