
Published January 11, 2022 | Version 1.0
Dataset (Open Access)

DeepDance: Motion capture data of improvised dance (2019)

  • Wallace, Benedikte (1); Nymoen, Kristian (1); Martin, Charles P. (2); Tørresen, Jim (1)
  • 1. University of Oslo
  • 2. Australian National University

Description

When using this resource, please cite: Wallace, B., Nymoen, K., Martin, C. P., & Tørresen, J. (2022). DeepDance: Motion capture data of improvised dance (2019) (Version 1.0) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.5838179

Abstract

This dataset comprises full-body motion capture of dance together with the corresponding audio files. Nine dancers were recorded individually, each improvising to six different pieces of music. Motion was captured at 250 Hz, in units of mm, using a Qualisys infrared optical system. The experiment was carried out at the University of Oslo in October 2019. For each dancer, three performances were recorded per musical piece, giving 9 × 6 × 3 = 162 motion capture files in total. The dataset was collected for use as training data in deep learning for motion generation. It also includes MATLAB code to visualize the motion capture files.

Music

  • Skarphedinsson, M., & Wallace, B. (2019). "Song a"
  • Skarphedinsson, M., & Wallace, B. (2019). "Song b"
  • Skarphedinsson, M., & Wallace, B. (2019). "Song c"
  • Skarphedinsson, M., & Wallace, B. (2019). "Song d"
  • LaClair, J. (2018). Bounce. Jesse LaClair. Referenced here as "Song e"
  • Skarphedinsson, M., & Wallace, B. (2019). "Song f"

Data Description

The following data types are provided:

  • Motion (marker positions): recorded with Qualisys Track Manager and saved as tab-separated .tsv files.
  • Stimuli: .wav audio files containing one minute of each of the tracks listed above.
  • Code: a MATLAB script for visualizing the motion capture files (an independent loading-and-plotting sketch follows this list).
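
The bundled MATLAB script is part of the archive and is not reproduced here. The following is a minimal, independent sketch of how one of the .tsv files could be loaded and animated, assuming a standard Qualisys Track Manager text export (a short metadata header followed by tab-separated X/Y/Z columns per marker, in mm); the file name is invented, and header detection may need adjusting for the actual export settings:

    % Hypothetical file name; actual names in the archive may differ.
    file = 'dancer01_songA_take1.tsv';

    % Let MATLAB detect the header block and tab-delimited layout. If the
    % export includes leading frame/time columns, drop them before plotting.
    opts = detectImportOptions(file, 'FileType', 'text', 'Delimiter', '\t');
    M = readmatrix(file, opts);            % frames x (3 * nMarkers)

    nMarkers = floor(size(M, 2) / 3);
    fps = 30;                              % post-processed frame rate

    for f = 1:size(M, 1)
        X = M(f, 1:3:3*nMarkers);          % x of every marker at frame f
        Y = M(f, 2:3:3*nMarkers);
        Z = M(f, 3:3:3*nMarkers);
        plot3(X, Y, Z, 'k.', 'MarkerSize', 12);
        axis equal; grid on;
        xlabel('x (mm)'); ylabel('y (mm)'); zlabel('z (mm)');
        drawnow;
        pause(1 / fps);                    % roughly real-time playback
    end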

Post-processing:

The motion capture files were downsampled from 250 Hz to 30 fps.

Body segments have been standardized across all dancers in the dataset, and the data have been normalized so that the root marker is set at the origin. For further details, please see [2].
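
The post-processing scripts themselves are not part of this description. The sketch below illustrates analogous processing under stated assumptions (a marker matrix M as loaded above, the root marker at index 1, and "root marker at the origin" read as a per-frame translation); it is not the authors' pipeline:

    % Assumes M is frames x (3 * nMarkers) at 250 Hz, e.g. loaded as above.
    fsIn = 250; fsOut = 30;
    rootIdx = 1;                           % assumed index of the root marker

    % Rational-factor resampling with anti-aliasing: 30/250 reduces to 3/25.
    [p, q] = rat(fsOut / fsIn);
    Mds = resample(M, p, q);               % Signal Processing Toolbox

    % Translate every frame so the root marker sits at the origin.
    root = Mds(:, 3*rootIdx-2 : 3*rootIdx);          % frames x 3
    Mn = Mds - repmat(root, 1, size(Mds, 2) / 3);    % centred data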


Acknowledgements

This work was partially supported by the Research Council of Norway through its Centres of Excellence scheme, project number 262762.


Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.


Files

  • DeepDance_v01.zip (206.2 MB, md5: c90dd41703154702f4096aad7f1b39af)
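
A downloaded copy can be verified against the MD5 checksum above. Here is a minimal sketch in MATLAB, using Java interop because base MATLAB has no built-in MD5 function:

    % Published checksum, copied from the file listing above.
    expected = 'c90dd41703154702f4096aad7f1b39af';

    % Read the whole archive (about 206 MB) into memory.
    fid = fopen('DeepDance_v01.zip', 'r');
    bytes = fread(fid, Inf, '*uint8');
    fclose(fid);

    % Compute the MD5 digest via the Java standard library.
    md = java.security.MessageDigest.getInstance('MD5');
    digest = typecast(md.digest(bytes), 'uint8');
    actual = lower(reshape(dec2hex(digest, 2)', 1, []));

    assert(strcmp(actual, expected), 'Checksum mismatch: re-download the file.');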

Additional details

References

  • [1] Wallace, B., Martin, C. P., Tørresen, J., & Nymoen, K. (2021). Learning Embodied Sound-Motion Mappings: Evaluating AI-Generated Dance Improvisation. In Creativity and Cognition (pp. 1–9).
  • [2] Wallace, B., Martin, C. P., Tørresen, J., & Nymoen, K. (2020). Towards Movement Generation with Audio Features. In Proceedings of the 11th International Conference on Computational Creativity.