BabyRobot Emotion Database (BRED)
Published May 27, 2019 | Version 1.0.0
Open Dataset
Creators
1. National Technical University of Athens
2. University of Thessaly
Description
This dataset contains data extracted from videos of children performing emotions. It comprises 215 samples in total and includes:
- Skeletons extracted by OpenPose.
- Facial landmarks extracted by the OpenFace toolkit.
- Features extracted by a ResNet-50 network pretrained on the AffectNet database. Features are stored in PyTorch format.
- Annotations by three different annotators. The annotations are hierarchical: in addition to the ground-truth emotion, they indicate whether the child used their body and/or face to perform the emotion.
- Note that there is a known error in the annotations.csv file. To obtain the correct paths, replace "spontaneous" with "game" and "acted" with "pre-game" in each path.
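The path correction described above can be scripted; below is a minimal sketch using only the Python standard library. The column name "path" is an assumption — check the actual header of annotations.csv before running it.

```python
import csv

def fix_annotation_path(path: str) -> str:
    """Apply the documented corrections: 'spontaneous' -> 'game', 'acted' -> 'pre-game'."""
    return path.replace("spontaneous", "game").replace("acted", "pre-game")

def fix_annotations(in_csv: str, out_csv: str, column: str = "path") -> None:
    """Rewrite the (assumed) path column of annotations.csv with corrected paths."""
    with open(in_csv, newline="") as f:
        rows = list(csv.DictReader(f))
    for row in rows:
        row[column] = fix_annotation_path(row[column])
    with open(out_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
```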
We also provide the code on GitHub. The accompanying paper can be found on arXiv.
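Since the ResNet-50 features are distributed in PyTorch format, they can presumably be read with `torch.load`; a minimal sketch is below. The file path in the example is hypothetical — the actual file names inside the archive may differ.

```python
import torch

def load_features(path: str) -> torch.Tensor:
    # map_location="cpu" lets the features load on machines without a GPU,
    # even if they were saved from CUDA tensors.
    return torch.load(path, map_location="cpu")

# Example usage (the file name below is hypothetical):
# feats = load_features("BRED_dataset/features/sample_0001.pt")
# print(feats.shape)
```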
Files (158.9 MB)

Name | Size | MD5
---|---|---
BRED_dataset.zip | 158.9 MB | md5:24116c5e39cc52481b98e69408386d58