Dataset Restricted Access

K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations

Cheul Young Park; Narae Cha; Soowon Kang; Auk Kim; Ahsan Habib Khandoker; Leontios Hadjileontiadis; Alice Oh; Yong Jeong; Uichin Lee

While recognizing emotions during social interactions has many potential applications with the popularization of low-cost mobile sensors, the heavy regulation of emotional behaviors in the wild compounds its difficulty. Therefore, studying emotions in the context of social interactions requires a novel dataset comprising multiple modalities and perspectives. K-EmoCon is such a multimodal dataset with comprehensive annotations of continuous emotions during naturalistic conversations. The dataset contains multimodal measurements, including audiovisual recordings, EEG, and peripheral physiological signals, acquired with off-the-shelf devices during 16 paired debates on a social issue, each approximately 10 minutes long. Distinct from previous datasets, it includes emotion annotations from all three available perspectives: self, debate partner, and external observers. Raters annotated emotional displays at 5-second intervals while viewing the debate footage, in terms of arousal-valence and 18 additional categorical emotions. The resulting K-EmoCon is the first multimodal dataset to accommodate the multiperspective assessment of emotions during social interactions.
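As a rough illustration of the annotation scheme described above (one arousal-valence rating every 5 seconds, from the self, partner, and external-observer perspectives), the sketch below arranges such ratings in a table and compares two perspectives. All column names and values are illustrative assumptions, not the dataset's actual file schema.

```python
# Hypothetical sketch of K-EmoCon-style annotations; the column names
# and rating values below are invented for illustration only.
import pandas as pd

# One rating per 5-second interval, here for the first 30 s of a debate.
seconds = range(0, 30, 5)
annotations = pd.DataFrame(
    {
        "time_s": list(seconds),
        "self_arousal": [3, 3, 4, 4, 2, 3],
        "self_valence": [2, 3, 3, 4, 4, 3],
        "partner_arousal": [3, 4, 4, 3, 2, 3],
        "observer_arousal": [2, 3, 4, 4, 3, 3],
    }
).set_index("time_s")

# Multiperspective data allows comparing raters, e.g. how well the
# self-reported arousal correlates with an external observer's rating.
agreement = annotations["self_arousal"].corr(annotations["observer_arousal"])
```

A layout like this makes it straightforward to align the per-interval annotations with the synchronized sensor streams and to quantify agreement between the three perspectives.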

Restricted Access

You may request access to the files in this upload, provided that you fulfil the conditions below. Whether to grant or deny access is solely the decision of the record owner.

A user must fill out the following form to be granted access to the dataset:

Once the form is submitted, we will review it and decide whether to grant or deny access.

                   All versions   This version
Views                     2,445            815
Downloads                   480             18
Data volume            821.6 GB        12.0 GB
Unique views              1,544            695
Unique downloads            144              8
