Published June 1, 2013 | Version v1
Conference paper | Open Access

Expressive Control of Indirect Augmented Reality During Live Music Performances

Description

Nowadays many music artists rely on visualisations and light shows to enhance and augment their live performances. However, the visualisation and the triggering of lights are normally scripted in advance and synchronised with the concert, severely limiting the artist's freedom for improvisation, expression and ad-hoc adaptation of their show. These scripts result in performances where the technology forces the artist and their music to stay in synchronisation with the pre-programmed environment. We argue that these limitations can be overcome based on emerging non-invasive tracking technologies in combination with an advanced gesture recognition engine.

We present a solution that uses explicit gestures and implicit dance moves to control the visual augmentation of a live music performance. We further illustrate how our framework overcomes existing limitations of gesture classification systems by delivering a precise recognition solution based on a single gesture sample in combination with expert knowledge. The presented solution enables a more dynamic and spontaneous performance and, when combined with indirect augmented reality, results in a more intense interaction between the artist and their audience.
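
The abstract mentions recognition from a single gesture sample but does not describe the mechanism on this page. A common baseline for such one-shot gesture classification is template matching with dynamic time warping (DTW); the Python sketch below illustrates that baseline only and is not the authors' engine. The function names and the distance threshold are hypothetical.

    import numpy as np

    def dtw_distance(a, b):
        # Dynamic time warping distance between two gesture trajectories,
        # each an (n_frames, n_dims) array of tracked coordinates.
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                     cost[i, j - 1],      # deletion
                                     cost[i - 1, j - 1])  # match
        return cost[n, m]

    def classify(observed, templates, threshold=5.0):
        # Match an observed trajectory against one stored template per
        # gesture class; return the best label, or None if nothing is
        # close enough. The threshold is an illustrative tuning value.
        best_label, best_dist = None, np.inf
        for label, template in templates.items():
            d = dtw_distance(observed, template)
            if d < best_dist:
                best_label, best_dist = label, d
        return best_label if best_dist < threshold else None

    # One recorded example per gesture (hypothetical 2-D wrist paths).
    templates = {
        "sweep": np.array([[0, 0], [1, 0], [2, 0], [3, 0]], dtype=float),
        "raise": np.array([[0, 0], [0, 1], [0, 2], [0, 3]], dtype=float),
    }
    observed = np.array([[0.0, 0.0], [1.1, 0.1], [2.0, -0.1], [2.9, 0.0]])
    print(classify(observed, templates))  # -> "sweep"

DTW tolerates the tempo variations typical of live performance, which is why it is a natural baseline when only one example per gesture is available; the paper's combination with expert knowledge goes beyond this sketch.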

Files

nime2013_32.pdf (6.9 MB)
md5:767d76ad5b118d23396b820d26f022ac
