10.5281/zenodo.3762962
https://zenodo.org/records/3762962
oai:zenodo.org:3762962
Cheul Young Park
0000-0003-0414-272X
Korea Advanced Institute of Science and Technology (KAIST)
Narae Cha
Korea Advanced Institute of Science and Technology (KAIST)
Soowon Kang
Korea Advanced Institute of Science and Technology (KAIST)
Auk Kim
Korea Advanced Institute of Science and Technology (KAIST)
Ahsan Habib Khandoker
Khalifa University
Leontios Hadjileontiadis
Khalifa University
Alice Oh
Korea Advanced Institute of Science and Technology (KAIST)
Yong Jeong
Korea Advanced Institute of Science and Technology (KAIST)
Uichin Lee
Korea Advanced Institute of Science and Technology (KAIST)
K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations
Zenodo
2020
Affective Computing
Human-Computer Interaction
2020-04-25
eng
10.5281/zenodo.3762961
0.1.0
While recognizing emotions during social interactions has many potential applications with the popularization of low-cost mobile sensors, the heavy regulation of emotional behaviors in the wild compounds its difficulty. Studying emotions in the context of social interactions therefore requires a novel dataset comprising multiple modalities and perspectives. K-EmoCon is such a multimodal dataset, with comprehensive annotations of continuous emotions during naturalistic conversations. The dataset contains multimodal measurements, including audiovisual recordings, EEG, and peripheral physiological signals, acquired with off-the-shelf devices during 16 paired debates on a social issue, each approximately 10 minutes long. Distinct from previous datasets, it includes emotion annotations from all three available perspectives: self, debate partner, and external observers. Raters annotated emotional displays at 5-second intervals while viewing the debate footage, in terms of arousal-valence and 18 additional categorical emotions. K-EmoCon is thus the first multimodal dataset to accommodate the multiperspective assessment of emotions during social interactions.
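As a minimal sketch of the annotation scheme described above, the snippet below enumerates the consecutive 5-second annotation windows covering one roughly 10-minute (600 s) debate session. The function name and tuple format are illustrative only and are not part of the released dataset.

```python
# Hypothetical helper: list the (start, end) times of the 5-second
# annotation intervals for a debate of a given duration in seconds.

def annotation_windows(duration_s: float, interval_s: float = 5.0):
    """Return (start, end) times of consecutive annotation intervals."""
    windows = []
    start = 0.0
    while start + interval_s <= duration_s:
        windows.append((start, start + interval_s))
        start += interval_s
    return windows

# A 600-second debate yields 120 five-second annotation intervals,
# each of which receives arousal-valence and categorical emotion labels
# from the self, partner, and external-observer perspectives.
windows = annotation_windows(600.0)
print(len(windows))             # → 120
print(windows[0], windows[-1])  # → (0.0, 5.0) (595.0, 600.0)
```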