Published June 14, 2022 | Version v1
Conference paper (Open Access)

PEM360: A dataset of 360° videos with continuous Physiological measurements, subjective Emotional ratings and Motion traces

  • 1. Université Côte d'Azur, CNRS, I3S, France
  • 2. Université Côte d'Azur, Inria, CNRS, I3S, France
  • 3. Université Côte d'Azur, Inria
  • 4. Université Côte d'Azur, CNRS, Inria, I3S, France
  • 5. Université Côte d'Azur, CHU Nice, CobTek, France

Description

From a user perspective, immersive content can elicit more intense emotions than flat-screen presentations. From a system perspective, efficient storage and distribution remain challenging and must take user attention into account. Understanding the connection between user attention, user emotions, and immersive content is therefore key. In this article, we present PEM360, a new dataset of user head movements and gaze recordings in 360° videos, along with self-reported emotional ratings of valence and arousal and continuous physiological measurements of electrodermal activity and heart rate. The stimuli are selected to enable spatiotemporal analysis of the connection between content, user motion, and emotion. We describe and provide a set of software tools to process the various data modalities, and introduce a joint instantaneous visualization of user attention and emotion that we name Emotional maps. We exemplify the new types of analyses the PEM360 dataset can enable. The entire dataset and code are made available in a reproducible framework.
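The Emotional maps idea, combining where users look with what they feel at each instant, can be illustrated with a minimal sketch: accumulate gaze samples on an equirectangular grid, weighting each sample by the arousal recorded at that moment. The function name, angle conventions, and grid resolution below are illustrative assumptions, not the dataset's actual tooling.

```python
import numpy as np

def emotional_map(gaze_yaw, gaze_pitch, arousal, width=64, height=32):
    """Toy arousal-weighted gaze heatmap on an equirectangular grid.

    gaze_yaw   : per-sample yaw angles in degrees, in [-180, 180)
    gaze_pitch : per-sample pitch angles in degrees, in [-90, 90)
    arousal    : per-sample arousal value used as the weight
    """
    # Map angles to integer pixel coordinates on the grid.
    x = ((np.asarray(gaze_yaw) + 180.0) / 360.0 * width).astype(int) % width
    y = ((np.asarray(gaze_pitch) + 90.0) / 180.0 * height).astype(int) % height
    grid = np.zeros((height, width))
    # Add each sample's arousal at its gaze location (handles repeated indices).
    np.add.at(grid, (y, x), arousal)
    return grid

# Toy example: two gaze samples at the frame center, arousal 0.5 each.
m = emotional_map([0.0, 0.0], [0.0, 0.0], [0.5, 0.5])
```

Such a map highlights the regions of the sphere that both attract attention and coincide with high emotional response, which is the kind of joint analysis the dataset is built to enable.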

Files (8.4 MB)

mmsys2022ods-final77.pdf (8.4 MB, md5:6f25ac1c0743db6cccaa3519b97fdb12)

Additional details

Funding

UCA DS4H – UCA Systèmes Numériques pour l'Homme (ANR-17-EURE-0004), Agence Nationale de la Recherche
AI4Media – A European Excellence Centre for Media, Society and Democracy (951911), European Commission