Thesis Open Access
This master's project studies music-evoked emotion using electroencephalography (EEG). The first goal of the research is to study the correlation between EEG signal characteristics and three emotional states evoked by music: happy, sad, and relaxed. A secondary goal is to correlate musical features obtained by music information retrieval (MIR) techniques with the EEG signals. An experiment was designed to record EEG while subjects listened to emotionally relevant songs. Six songs with emotional content (happy, sad, and relaxed) were selected: three of them were chosen by us from a dataset of songs previously classified according to their emotional content by MIR techniques, and the other three were chosen by each participating subject according to their own preferences. The obtained data, EEG signals and music audio, were analyzed to investigate correlations between them. Features were extracted from the EEG signals, some of them emotionally related, and used to predict the type of music the subject was listening to. The obtained classifiers predict the music type with up to 98% accuracy (baseline accuracy is 16%). Musical features were also extracted and correlated with the EEG features: more than ten correlations were found with correlation coefficients greater than 0.25 and p-values smaller than 0.05.
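The final screening step described above, keeping only feature pairs whose correlation coefficient exceeds 0.25, can be sketched as follows. This is a hypothetical illustration, not the thesis's actual pipeline: the feature values are toy data, and `pearson_r` is a helper written here for the example. In practice the p-value threshold (< 0.05) would also be applied, e.g. with `scipy.stats.pearsonr`, which returns both the coefficient and the p-value.

```python
# Hypothetical sketch: screen an (EEG feature, musical feature) pair
# by Pearson correlation, keeping it if |r| > 0.25. Standard library
# only; the p-value cut used in the study is noted but not computed.
import math

def pearson_r(x, y):
    # Pearson correlation coefficient of two equal-length sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy values standing in for one EEG feature and one musical feature,
# one value per song/trial (invented for illustration).
eeg_feature = [0.10, 0.40, 0.35, 0.80, 0.70]
music_feature = [1.0, 2.1, 1.9, 3.8, 3.5]

r = pearson_r(eeg_feature, music_feature)
if abs(r) > 0.25:
    print(f"pair kept: r = {r:.3f}")
```

With `scipy.stats.pearsonr` the same screen would additionally check the returned p-value against 0.05 before keeping the pair.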