Published June 1, 2012 | Version v1
Conference paper | Open Access

Movement to emotions to music: using whole body emotional expression as an interaction for electronic music generation

Description

The augmented ballet project aims to gather research from several fields and direct it towards a common application case: adding virtual elements (visual and acoustic) to a live dance performance, and allowing the dancer to interact with them. In this paper, we describe a novel interaction developed within this project: using the dancer's movements to recognize the emotions they express, and using these emotions to generate musical audio flows that evolve in real time. The originality of this interaction is threefold. First, it covers the whole interaction cycle, from the input (the dancer's movements) to the output (the generated music). Second, the interaction is not direct but passes through a high level of abstraction: the dancer's emotional expression is recognized and serves as the source of the music generation. Third, the interaction was designed and validated through constant collaboration with a choreographer, culminating in an augmented ballet performance in front of a live audience.
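The description outlines a full interaction cycle: whole-body movement features are mapped to a recognized emotion, which in turn drives real-time music generation. The Python below is a minimal illustrative sketch of that architecture, not the authors' implementation: the movement features (quantity of motion, contraction index), the rule-based classifier, and the emotion-to-music parameter mapping are all hypothetical stand-ins.

```python
# Hypothetical sketch of the movement -> emotion -> music pipeline
# described in the abstract. All feature names, thresholds, and
# mappings are illustrative assumptions, not taken from the paper.

from dataclasses import dataclass


@dataclass
class MotionFrame:
    """One frame of whole-body movement features (assumed names)."""
    quantity_of_motion: float  # overall movement energy, in [0, 1]
    contraction_index: float   # posture contraction/expansion, in [0, 1]


def classify_emotion(frame: MotionFrame) -> str:
    """Toy rule-based mapping from movement qualities to an emotion label."""
    if frame.quantity_of_motion > 0.7:
        # Energetic movement: open posture reads as joy, closed as anger.
        return "joy" if frame.contraction_index < 0.5 else "anger"
    # Low-energy movement: open posture reads as calm, closed as sadness.
    return "calm" if frame.contraction_index < 0.5 else "sadness"


def music_parameters(emotion: str) -> dict:
    """Map the recognized emotion to real-time music generation parameters."""
    mapping = {
        "joy":     {"tempo_bpm": 140, "mode": "major", "loudness": 0.8},
        "anger":   {"tempo_bpm": 150, "mode": "minor", "loudness": 0.9},
        "calm":    {"tempo_bpm": 70,  "mode": "major", "loudness": 0.4},
        "sadness": {"tempo_bpm": 60,  "mode": "minor", "loudness": 0.3},
    }
    return mapping[emotion]


if __name__ == "__main__":
    # One step of the cycle: movement frame in, music parameters out.
    frame = MotionFrame(quantity_of_motion=0.9, contraction_index=0.3)
    emotion = classify_emotion(frame)
    print(emotion, music_parameters(emotion))
```

In a live setting such a loop would run per motion-capture frame, smoothing the recognized emotion over time before updating the audio engine, so the generated music evolves continuously rather than jumping between states.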

Files

nime2012_180.pdf (567.3 kB)
md5:1a70b76576796c7a428910d9b3eccc8c