Published June 1, 2013 | Version v1
Conference paper | Open Access

Enabling Multimodal Mobile Interfaces for Musical Performance

Description

We present research that extends the scope of the mobile application Control, a prototyping environment for defining multimodal interfaces that control real-time artistic and musical performances. Control allows users to rapidly create interfaces employing a variety of modalities, including speech recognition, computer vision, musical feature extraction, touchscreen widgets, and inertial sensor data. Information from these modalities can be transmitted wirelessly to remote applications. Interfaces are declared using JSON and can be extended with JavaScript to add complex behaviors, including the concurrent fusion of multimodal signals. By simplifying the creation of interfaces via these simple markup files, Control allows musicians and artists to make novel applications that use and combine both discrete and continuous data from the wide range of sensors available on commodity mobile devices.
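To make the declarative approach concrete, the sketch below shows what such a JSON interface file might look like: one touchscreen slider and one button, each bound to an address for wireless transmission, with an inline JavaScript snippet adding custom behavior. All property names here (pages, type, address, the event handler, and so on) are illustrative assumptions for this sketch and may not match Control's actual schema.

// Illustrative sketch only; the names below are assumptions, not Control's documented API.
// An interface is a JSON-style declaration of widgets, optionally extended with JavaScript.
pages = [[
  {
    "name": "volume",             // widget identifier
    "type": "Slider",             // a continuous touchscreen widget
    "x": 0.1, "y": 0.1,           // position, normalized to screen size
    "width": 0.8, "height": 0.15,
    "address": "/synth/volume"    // destination address for wireless transmission (e.g. OSC)
  },
  {
    "name": "trigger",
    "type": "Button",             // a discrete touchscreen widget
    "x": 0.1, "y": 0.4,
    "width": 0.3, "height": 0.2,
    "address": "/synth/trigger",
    // JavaScript extension (hypothetical handler name): on press, remap
    // another widget's value, a trivial case of scripted behavior
    "ontouchstart": "volume.setValue(Math.random());"
  }
]];

In this sketch, moving the slider would send a continuous stream of values to /synth/volume on the remote application, while the button's embedded JavaScript shows how widgets can be coupled, the simplest case of combining discrete and continuous data.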

Files (1.6 MB)

nime2013_303.pdf (1.6 MB, md5:d8b8b0fa99ab0fbd4059662e4ac9de18)