Published July 7, 2018 | Version v1
Conference paper | Open Access

A Collaborative System for Composing Music via Motion Using a Kinect Sensor and Skeletal Data

  • School of Electrical and Computer Engineering, National Technical University of Athens

Description

This paper describes MoveSynth, a performance system for two players, who interact with it and collaborate with
each other in various ways, including full-body movements, arm postures, and continuous gestures, to compose music
in real time. The system uses a Kinect sensor to track the performers’ positions, as well as their arm and
hand movements. In its current state, the musical parameters that the performers can influence include the
pitch and volume of the music, the timbre of the sound, and the time interval between successive notes. We
experimented extensively with various classifiers to identify the one that performs best at continuous gesture
and arm posture recognition, achieving accuracies of 92.11% for continuous gestures and
99.33% for arm postures, using a 1-NN classifier with a condensed search space in both cases. Additionally, the
qualitative results of the usability testing of the final system, performed by 9 users, are encouraging and
identify possible avenues for further exploration and improvement.
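The paper does not reproduce its classification code here, but the core idea of 1-NN recognition over skeletal features can be sketched as follows. This is a minimal illustration, not the authors' implementation: the two-angle feature vector, the pose labels, and the `nn_classify` helper are all hypothetical stand-ins for the actual Kinect-derived features, and the condensed search space (e.g. a reduced training set) is omitted for brevity.

```python
import numpy as np

def nn_classify(train_feats, train_labels, query):
    """Return the label of the nearest training sample (1-NN, Euclidean distance)."""
    dists = np.linalg.norm(train_feats - query, axis=1)  # distance to every stored pose
    return train_labels[int(np.argmin(dists))]           # label of the closest one

# Toy skeletal features: (left-arm elevation, right-arm elevation) in degrees.
# A real system would use many more joint coordinates or joint angles.
train_feats = np.array([
    [ 90.0,  90.0],   # arms horizontal ("t_pose")
    [  0.0,   0.0],   # arms at the sides ("arms_down")
    [170.0, 170.0],   # arms raised ("arms_up")
])
train_labels = ["t_pose", "arms_down", "arms_up"]

# A noisy observation near the T-pose is assigned the nearest stored label.
print(nn_classify(train_feats, train_labels, np.array([85.0, 95.0])))  # t_pose
```

A condensed search space, as used in the paper, would replace `train_feats` with a pruned subset of prototypes that preserves the decision boundaries while speeding up the nearest-neighbour search.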

Files

smc2018_garoufis.pdf (918.7 kB)
md5:78841a643d1b59dc8783f1ac2d58a3d5

Additional details

Funding

European Commission
iMuSciCA - Interactive Music Science Collaborative Activities (grant agreement 731861)