Published May 10, 2022 | Version 1.00
Dataset | Open

SaGA++ Speech-Gesture Dataset Extension

Description

This is the SaGA++ extension release of the Bielefeld Speech and Gesture Alignment (SaGA) dataset.

We have extracted anonymized features from all 25 recordings and now release them in full, instead of the 6 recordings released previously. The modalities included are text annotations, gesture annotations, prosodic audio features, and movement trajectories. More details are provided in the README document included in the dataset.
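
As a minimal sketch of how the per-recording features might be loaded in Python (the file names, directory layout, and formats below are illustrative assumptions only; the actual layout is specified in the bundled README):

    # Sketch of loading the four modalities for one recording; all paths
    # and formats below are hypothetical -- consult the bundled README
    # for the actual layout of the release.
    from pathlib import Path

    import numpy as np
    import pandas as pd

    DATA_ROOT = Path("SaGA++ Dataset")  # hypothetical extraction directory

    def load_recording(recording_id: int) -> dict:
        """Gather all four modalities for one of the 25 recordings."""
        rec_dir = DATA_ROOT / f"recording_{recording_id:02d}"  # hypothetical naming
        return {
            # word-level transcription with timestamps (hypothetical CSV)
            "text": pd.read_csv(rec_dir / "text_annotations.csv"),
            # gesture annotations with start/end times (hypothetical CSV)
            "gestures": pd.read_csv(rec_dir / "gesture_annotations.csv"),
            # frame-level prosodic audio features (hypothetical NumPy array)
            "prosody": np.load(rec_dir / "prosodic_features.npy"),
            # movement trajectories over time (hypothetical NumPy array)
            "motion": np.load(rec_dir / "movement_trajectories.npy"),
        }

    if __name__ == "__main__":
        rec = load_recording(1)
        print({name: data.shape for name, data in rec.items()})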

Please reference the following papers in any publication making use of the Dataset:

Kucherenko, T., Nagy, R., Neff, M., Kjellström, H., & Henter, G. E. (2022). Multimodal analysis of the predictability of hand-gesture properties. 21st International Conference on Autonomous Agents and Multiagent Systems (AAMAS). 

Lücking, A., Bergman, K., Hahn, F., Kopp, S., & Rieser, H. (2013). Data-based analysis of speech and gesture: The Bielefeld Speech and Gesture Alignment Corpus (SaGA) and its applications. Journal on Multimodal User Interfaces, 7(1), 5–18.

Files

SaGA++ Dataset.zip (367.2 MB)
md5:d131796cd4de4207d3db5c02578f462a

Additional details

Related works

Cites
Journal article: 10.1007/s12193-012-0106-8 (DOI)
Conference paper: 10.48550/arXiv.2108.05762 (DOI)

References

  • Kucherenko, T., Nagy, R., Neff, M., Kjellström, H., & Henter, G. E. (2022). Multimodal analysis of the predictability of hand-gesture properties. In Proceedings of the 21st International Conference on Autonomous Agents and Multiagent Systems (AAMAS).
  • Lücking, A., Bergman, K., Hahn, F., Kopp, S., & Rieser, H. (2013). Data-based analysis of speech and gesture: The Bielefeld Speech and Gesture Alignment Corpus (SaGA) and its applications. Journal on Multimodal User Interfaces, 7(1), 5–18.