Video/Audio Open Access

Neural Touch-Screen Ensemble Performance 2017-07-03

Martin, Charles Patrick


JSON-LD (schema.org) Export

{
  "description": "<p>A studio performance of an RNN-controlled Touch Screen Ensemble from 2017-07-03 at the University of Oslo.</p>\n\n<p>In this performance, a touch-screen musician improvises with a\u00a0computer-controlled ensemble of three artificial performers. A\u00a0recurrent neural network tracks\u00a0the touch gestures of the\u00a0human\u00a0performer and predicts musically appropriate gestural responses for the three artificial musicians. The performances on the three 'AI' iPads are then constructed from matching snippets of previous human recordings. A plot of the whole ensemble's touch gestures are shown on the projected screen.</p>\n\n<p>This performance uses Metatone Classifier (https://doi.org/10.5281/zenodo.51712)\u00a0to track touch gestures and Gesture-RNN (https://github.com/cpmpercussion/gesture-rnn) to predict\u00a0gestural states for the ensemble. The touch-screen app used in this performance was PhaseRings (https://doi.org/10.5281/zenodo.50860).</p>", 
  "license": "https://creativecommons.org/licenses/by/4.0/legalcode", 
  "creator": [
    {
      "affiliation": "University of Oslo", 
      "@type": "Person", 
      "name": "Martin, Charles Patrick"
    }
  ], 
  "url": "https://zenodo.org/record/831910", 
  "datePublished": "2017-07-19", 
  "keywords": [
    "performance, music, touch-screen, improvisation, recurrent neural network"
  ], 
  "@context": "https://schema.org/", 
  "identifier": "https://doi.org/10.5281/zenodo.831910", 
  "@id": "https://doi.org/10.5281/zenodo.831910", 
  "@type": "MediaObject", 
  "name": "Neural Touch-Screen Ensemble Performance 2017-07-03"
}
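
The description above outlines a pipeline: a classifier labels the human performer's raw touches as gestures, an RNN predicts gestural states for the three artificial performers, and each predicted state is realised by playing back a matching snippet of a previous human recording. The following is a purely illustrative sketch of that data flow, not the actual Metatone Classifier or Gesture-RNN code; the gesture labels, function names, and the toy random predictor standing in for the RNN are all hypothetical.

```python
# Illustrative sketch of the classify -> predict -> snippet-lookup pipeline.
# All names and gesture classes here are hypothetical; the real system uses
# Metatone Classifier and Gesture-RNN (see links in the description).
import random

GESTURES = ["nothing", "tap", "swipe", "swirl"]  # hypothetical gesture vocabulary

def classify_touches(touch_events):
    """Stand-in for the touch-gesture classifier: label a window of raw touches."""
    if not touch_events:
        return "nothing"
    return "tap" if len(touch_events) < 5 else "swirl"

def predict_ensemble(lead_gesture, rng=random.Random(0)):
    """Stand-in for the RNN: predict one gesture for each of three AI performers,
    conditioned (crudely) on the human lead's current gesture."""
    pool = GESTURES if lead_gesture == "nothing" else [g for g in GESTURES if g != "nothing"]
    return [rng.choice(pool) for _ in range(3)]

def choose_snippets(ensemble_gestures, corpus):
    """Pick a previously recorded human snippet matching each predicted gesture."""
    return [corpus.get(g, []) for g in ensemble_gestures]

# One step of the loop: classify the human's touches, predict the ensemble's
# gestural states, then fetch matching recorded snippets for playback.
lead = classify_touches([(0.1, 0.2)] * 8)
states = predict_ensemble(lead)
snippets = choose_snippets(states, {"tap": ["rec_a"], "swirl": ["rec_b"]})
```

In the real system this loop runs continuously during the performance, so the RNN conditions on a sequence of gestures rather than a single label as in this sketch.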
                    All versions   This version
Views                         52             51
Downloads                     10             10
Data volume               2.5 GB         2.5 GB
Unique views                  50             49
Unique downloads              10             10
