Software Open Access
Charles Martin
{ "description": "<p>IMPS: The Interactive Musical Predictive System</p>\n\n<p>IMPS is a system for predicting musical control data in live performance. It uses a mixture density recurrent neural network (MDRNN) to observe control inputs over multiple time steps, predicting the next value of each step, and the time that expects the next value to occur. It provides an input and output interface over OSC and can work with musical interfaces with any number of real-valued inputs (we've tried from 1-8). Several interactive paradigms are supported for call-response improvisation, as well as independent operation, and "filtering" of the performer's input. Whenever you use IMPS, your input data is logged to build up a training corpus and a script is provided to train new versions of your model.</p>\n\n<p>Documentation can be found at:</p>\n\n<p><a href=\"https://github.com/cpmpercussion/imps\">https://github.com/cpmpercussion/imps</a></p>\n\n<p>and</p>\n\n<p><a href=\"https://creativepredictions.xyz/imps\">https://creativepredictions.xyz/imps</a></p>\n\n<p> </p>", "license": "https://opensource.org/licenses/MIT", "creator": [ { "affiliation": "University of Oslo", "@type": "Person", "name": "Charles Martin" } ], "url": "https://zenodo.org/record/2580176", "codeRepository": "https://github.com/cpmpercussion/imps/tree/v0.1", "datePublished": "2019-02-28", "version": "v0.1", "keywords": [ "python, music, prediction, mdn, rnn, neural-network, mixture-density-network" ], "@context": "https://schema.org/", "identifier": "https://doi.org/10.5281/zenodo.2580176", "@id": "https://doi.org/10.5281/zenodo.2580176", "@type": "SoftwareSourceCode", "name": "IMPS: The Interactive Musical Prediction System v0.1" }
| | All versions | This version |
|---|---|---|
| Views | 110 | 65 |
| Downloads | 19 | 4 |
| Data volume | 26.7 MB | 5.0 MB |
| Unique views | 99 | 60 |
| Unique downloads | 14 | 3 |