Published July 19, 2017 | Version v1
Video/Audio | Open

Neural Touch-Screen Ensemble Performance 2017-07-03

  • University of Oslo

Description

A studio performance of an RNN-controlled Touch Screen Ensemble from 2017-07-03 at the University of Oslo.

In this performance, a touch-screen musician improvises with a computer-controlled ensemble of three artificial performers. A recurrent neural network tracks the touch gestures of the human performer and predicts musically appropriate gestural responses for the three artificial musicians. The performances on the three 'AI' iPads are then constructed from matching snippets of previous human recordings. A plot of the whole ensemble's touch gestures is shown on the projected screen.

This performance uses Metatone Classifier (https://doi.org/10.5281/zenodo.51712) to track touch gestures and Gesture-RNN (https://github.com/cpmpercussion/gesture-rnn) to predict gestural states for the ensemble. The touch-screen app used in this performance was PhaseRings (https://doi.org/10.5281/zenodo.50860).
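To make the pipeline described above concrete, the sketch below shows one way such a loop could fit together: classify the human performer's touches into a gesture state, ask an RNN for gesture states for the three artificial performers, then pick matching recorded snippets for playback. This is a minimal, hypothetical illustration only; the names (classify_gesture, GestureRNN, SnippetLibrary), the gesture labels, and the one-second loop interval are assumptions for the example and are not the actual Metatone Classifier or Gesture-RNN interfaces.

```python
# Hypothetical sketch of the ensemble-direction loop described above.
# All names and labels here are illustrative placeholders.
import random
import time

GESTURE_CLASSES = ["nothing", "tap", "swipe", "swirl", "combination"]  # illustrative labels


def classify_gesture(touch_events):
    """Placeholder: map a window of touch events to a single gesture class."""
    if not touch_events:
        return "nothing"
    return random.choice(GESTURE_CLASSES[1:])


class GestureRNN:
    """Placeholder for an RNN that maps the lead performer's gesture
    (and its own previous output) to gesture states for three ensemble parts."""

    def __init__(self, n_parts=3):
        self.n_parts = n_parts
        self.previous = ["nothing"] * n_parts

    def predict(self, lead_gesture):
        # A real model would condition on lead_gesture and self.previous;
        # here we simply sample something plausible.
        self.previous = [random.choice(GESTURE_CLASSES) for _ in range(self.n_parts)]
        return self.previous


class SnippetLibrary:
    """Placeholder store of previously recorded human touch snippets, keyed by gesture."""

    def snippet_for(self, gesture):
        return f"<recorded snippet matching '{gesture}'>"


def run(steps=5):
    rnn, library = GestureRNN(), SnippetLibrary()
    for _ in range(steps):
        touch_events = []              # would arrive from the lead performer's iPad
        lead = classify_gesture(touch_events)
        ensemble = rnn.predict(lead)   # gesture states for the three 'AI' iPads
        playback = [library.snippet_for(g) for g in ensemble]
        print(lead, ensemble, playback)
        time.sleep(1.0)                # interval is illustrative, not the system's actual rate


if __name__ == "__main__":
    run()
```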

Notes

This work is supported by The Research Council of Norway as a part of the Engineering Predictability with Embodied Cognition (EPEC) project, under grant agreement 240862.

Files

neural-touch-screen-session-2017-07-03.mp4 (248.5 MB)
md5:ed4904bb1d053d17cc9c953a3593962e
