Published June 1, 2013 | Version v2
Conference paper | Open Access

A Self-Organizing Gesture Map for a Voice-Controlled Instrument Interface

Description

ABSTRACT: Mapping gestures to digital musical instrument parameters is not trivial when the dimensionality of the sensor-captured data is high and the model relating the gesture to the sensor data is unknown. In these cases, a front-end processing system for extracting the gestural information embedded in the sensor data is essential. In this paper we propose an unsupervised offline method that learns how to reduce and map the gestural data to a generic instrument parameter control space. We make an unconventional use of Self-Organizing Maps to obtain only a geometrical transformation of the gestural data, while dimensionality reduction is handled separately. We introduce a novel training procedure that overcomes two main limitations of Self-Organizing Maps which would otherwise compromise the usability of the interface. As an evaluation, we apply this method to our existing Voice-Controlled Interface for musical instruments, obtaining noticeable usability improvements.
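To illustrate the general idea of mapping gestural feature vectors onto a low-dimensional control space with a Self-Organizing Map, the sketch below trains a small SOM and then maps a new gesture vector to normalized 2-D coordinates. This is a minimal, generic SOM example, not the authors' modified training procedure; the function names (`som_train`, `som_map`), the grid size, and the unit-square output space are assumptions made for illustration.

```python
import numpy as np

def som_train(data, grid=(10, 10), epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Train a small self-organizing map on gestural feature vectors.

    data: array of shape (n_samples, n_features).
    Returns the map weights, shape (grid_x, grid_y, n_features).
    """
    rng = np.random.default_rng(seed)
    gx, gy = grid
    w = rng.normal(size=(gx, gy, data.shape[1]))
    # Grid coordinates of each map node, used for the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy), indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Best-matching unit: node whose weight vector is closest to x.
            d = np.linalg.norm(w - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Linearly decay learning rate and neighbourhood radius over time.
            frac = t / n_steps
            lr = lr0 * (1.0 - frac)
            sigma = sigma0 * (1.0 - frac) + 1e-3
            # Gaussian neighbourhood around the BMU on the map grid.
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1) / (2 * sigma**2))
            w += lr * g[..., None] * (x - w)
            t += 1
    return w

def som_map(w, x):
    """Map a gesture vector to normalized 2-D control coordinates in [0, 1]^2."""
    d = np.linalg.norm(w - x, axis=-1)
    bmu = np.unravel_index(np.argmin(d), d.shape)
    gx, gy = d.shape
    return bmu[0] / (gx - 1), bmu[1] / (gy - 1)
```

In this toy setting, the two grid indices of the best-matching unit act as two continuous-valued instrument control parameters; the paper's contribution concerns how the map is trained so that this lookup behaves well as an interface, which the generic update rule above does not capture.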

Files

nime2013_50.pdf (571.6 kB)
md5:bf754e1d2a434f79fc7ed72778582f54