
Published January 5, 2022 | Version 0
Dataset | Open Access

eSEEd: emotional State Estimation based on Eye-tracking dataset

  • 1. Institute of Computer Science, Foundation for Research and Technology Hellas (FORTH), GR-700 13 Heraklion, Crete, Greece
  • 2. Institute of Computer Science, Foundation for Research and Technology Hellas (FORTH) and the Laboratory of Optics and Vision, School of Medicine, University of Crete, GR-710 03 Heraklion, Greece
  • 3. Department of Biomedical Research, Institute of Molecular Biology and Biotechnology, FORTH, GR-451 10, Ioannina, Greece
  • 4. Department of Materials Science and Engineering, Unit of Medical Technology and Intelligent Information Systems, University of Ioannina, GR-451 10, Ioannina, Greece
  • 5. Department of Biomedical Research, Institute of Molecular Biology and Biotechnology, FORTH, GR-451 15, Ioannina, Greece and the Department of Materials Science and Engineering, Unit of Medical Technology and Intelligent Information Systems, University of Ioannina, GR-451 10, Ioannina, Greece
  • 6. Institute of Computer Science, Foundation for Research and Technology Hellas (FORTH) and the Department of Electrical and Computer Engineering, Hellenic Mediterranean University, GR-710 04 Heraklion, Crete, Greece

Description

We present eSEEd, the emotional State Estimation based on Eye-tracking dataset. Eye movements of 48 participants were recorded as they watched 10 emotion-evoking videos, each followed by a neutral video. Participants rated five emotions (tenderness, anger, disgust, sadness, neutral) on a scale from 0 to 10; these ratings were later translated into emotional arousal and valence levels. Furthermore, each participant filled in three self-assessment questionnaires. An extensive analysis of the participants' self-assessment questionnaire scores, as well as of their ratings during the experiments, is presented. Moreover, eye and gaze features were extracted from the low-level recorded eye metrics, and their correlations with the participants' ratings are investigated. Finally, analysis and results are presented for machine learning approaches to the classification of arousal and valence levels based solely on eye and gaze features. The dataset is made publicly available, and we encourage other researchers to use it for testing new methods and analytic pipelines for the estimation of an individual's affective state.

Important note: Version 0 contains only the video files and the README. The eye-tracking data will be uploaded in the next version.
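Once the eye-tracking data are included, a typical use of the dataset would be to train a classifier on the extracted eye and gaze features, as described above. The following is a minimal sketch, not the authors' pipeline: the file name, feature columns, and label column are hypothetical placeholders (none of them ship with Version 0), and the classifier choice is illustrative only.

```python
# Minimal sketch: classify valence levels from eye/gaze features.
# "features.csv" and all column names below are hypothetical placeholders;
# the actual feature set is described in the accompanying paper.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("features.csv")  # assumed per-trial feature table

X = df[["fixation_duration", "saccade_amplitude", "pupil_diameter"]]  # assumed feature columns
y = df["valence_level"]  # assumed label column (e.g., low vs. high valence)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation accuracy
print(f"Mean CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```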

Notes

This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 826429 (Project: SeeFar). This publication reflects only the authors' view, and the Commission is not responsible for any use that may be made of the information it contains. Please cite:

Files

eSEEd_v0.1.zip

Files (473.1 MB)

  • 236.6 MB, md5:4b2aa54884c27dfffd377f97d44aa94c
  • 236.6 MB, md5:f75677f4d6f81677a6159a9ae7462480
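The listed MD5 checksums can be used to verify the downloaded files before unpacking. Below is a minimal sketch in Python; the file name passed to it is a placeholder, since the record lists only checksums and sizes.

```python
# Verify a downloaded file against its listed MD5 checksum.
import hashlib

def md5sum(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the MD5 hex digest of a file, reading it in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Placeholder file name; compare against the checksums listed above.
expected = "4b2aa54884c27dfffd377f97d44aa94c"
assert md5sum("eSEEd_v0.1.zip") == expected, "checksum mismatch, re-download the file"
```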
