Video/Audio Open Access
Martin, Charles Patrick
A studio performance of an RNN-controlled Touch Screen Ensemble from 2017-07-03 at the University of Oslo.
In this performance, a touch-screen musician improvises with a computer-controlled ensemble of three artificial performers. A recurrent neural network tracks the touch gestures of the human performer and predicts musically appropriate gestural responses for the three artificial musicians. The performances on the three 'AI' iPads are then constructed from matching snippets of previous human recordings. A plot of the whole ensemble's touch gestures is shown on the projected screen.
This performance uses Metatone Classifier (https://doi.org/10.5281/zenodo.51712) to track touch gestures and Gesture-RNN (https://github.com/cpmpercussion/gesture-rnn) to predict gestural states for the ensemble. The touch-screen app used in this performance was PhaseRings (https://doi.org/10.5281/zenodo.50860).
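The sketch below illustrates the general idea of the gesture-prediction step: a recurrent network reads a window of the lead performer's recent gesture classes and outputs a joint gestural state for the three artificial performers. This is a hypothetical illustration only; the gesture vocabulary size, window length, network sizes, and the function `predict_ensemble_gestures` are assumptions for the sketch, not the actual Gesture-RNN implementation (see the repository linked above for that).

```python
# Hypothetical sketch, NOT the actual Gesture-RNN code: a recurrent model that
# maps the lead performer's recent gesture classes to one predicted gesture
# class per artificial ensemble performer.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_GESTURES = 9                  # assumed size of the touch-gesture vocabulary
NUM_ENSEMBLE = 3                  # three artificial iPad performers
SEQ_LEN = 30                      # assumed length of the gesture-history window
NUM_STATES = NUM_GESTURES ** NUM_ENSEMBLE  # joint ensemble state as one class

model = models.Sequential([
    layers.Embedding(NUM_GESTURES, 32),      # embed discrete gesture classes
    layers.LSTM(128),                        # summarise the recent history
    layers.Dense(NUM_STATES, activation="softmax"),  # joint ensemble state
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

def predict_ensemble_gestures(lead_history):
    """Given at least SEQ_LEN gesture classes from the lead performer,
    return one gesture class for each artificial performer (illustrative
    decoding of the joint state index)."""
    x = np.array(lead_history[-SEQ_LEN:]).reshape(1, SEQ_LEN)
    joint = int(np.argmax(model.predict(x, verbose=0)))
    # Unpack the joint class index into NUM_ENSEMBLE per-performer classes.
    return [(joint // NUM_GESTURES ** i) % NUM_GESTURES
            for i in range(NUM_ENSEMBLE)]
```

In a live setting, the predicted gesture classes would then be used to select matching snippets of previously recorded human touch data for playback on the three 'AI' iPads, as described above.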
| Name | MD5 | Size |
|---|---|---|
| neural-touch-screen-session-2017-07-03.mp4 | ed4904bb1d053d17cc9c953a3593962e | 248.5 MB |
Martin, C., & Swift, B. (2016). MetatoneClassifier: Research Prototype. Zenodo. http://doi.org/10.5281/zenodo.51712
Martin, C. (2016). PhaseRings v1.2.0. Zenodo. http://doi.org/10.5281/zenodo.50860
Martin, C., Gardner, H., & Swift, B. (2015). Tracking ensemble performance on touch-screens with gesture classification and transition matrices. In Proceedings of the International Conference on New Interfaces for Musical Expression, NIME '15, pages 359–364. http://www.nime.org/proceedings/2015/nime2015_242.pdf
| | All versions | This version |
|---|---|---|
| Views | 52 | 51 |
| Downloads | 10 | 10 |
| Data volume | 2.5 GB | 2.5 GB |
| Unique views | 50 | 49 |
| Unique downloads | 10 | 10 |