Software Open Access
Charles Martin
<?xml version='1.0' encoding='UTF-8'?> <record xmlns="http://www.loc.gov/MARC21/slim"> <leader>00000nmm##2200000uu#4500</leader> <datafield tag="653" ind1=" " ind2=" "> <subfield code="a">python, music, prediction, mdn, rnn, neural-network, mixture-density-network</subfield> </datafield> <controlfield tag="005">20200125072334.0</controlfield> <controlfield tag="001">2580176</controlfield> <datafield tag="856" ind1="4" ind2=" "> <subfield code="s">1238022</subfield> <subfield code="z">md5:cff4d07dfd4bf536a040aeb8903f7f0d</subfield> <subfield code="u">https://zenodo.org/record/2580176/files/cpmpercussion/imps-v0.1.zip</subfield> </datafield> <datafield tag="542" ind1=" " ind2=" "> <subfield code="l">open</subfield> </datafield> <datafield tag="260" ind1=" " ind2=" "> <subfield code="c">2019-02-28</subfield> </datafield> <datafield tag="909" ind1="C" ind2="O"> <subfield code="p">software</subfield> <subfield code="o">oai:zenodo.org:2580176</subfield> </datafield> <datafield tag="100" ind1=" " ind2=" "> <subfield code="u">University of Oslo</subfield> <subfield code="a">Charles Martin</subfield> </datafield> <datafield tag="245" ind1=" " ind2=" "> <subfield code="a">IMPS: The Interactive Musical Prediction System v0.1</subfield> </datafield> <datafield tag="540" ind1=" " ind2=" "> <subfield code="u">https://opensource.org/licenses/MIT</subfield> <subfield code="a">MIT License</subfield> </datafield> <datafield tag="650" ind1="1" ind2="7"> <subfield code="a">cc-by</subfield> <subfield code="2">opendefinition.org</subfield> </datafield> <datafield tag="520" ind1=" " ind2=" "> <subfield code="a"><p>IMPS: The Interactive Musical Prediction System</p> <p>IMPS is a system for predicting musical control data in live performance. It uses a mixture density recurrent neural network (MDRNN) to observe control inputs over multiple time steps, predicting the next value of each input and the time at which it expects the next value to occur.
It provides an input and output interface over OSC and can work with musical interfaces with any number of real-valued inputs (we&#39;ve tried 1 to 8). Several interaction paradigms are supported: call-and-response improvisation, independent operation, and &quot;filtering&quot; of the performer&#39;s input. Whenever you use IMPS, your input data is logged to build up a training corpus, and a script is provided to train new versions of your model.</p> <p>Documentation can be found at:</p> <p><a href="https://github.com/cpmpercussion/imps">https://github.com/cpmpercussion/imps</a></p> <p>and</p> <p><a href="https://creativepredictions.xyz/imps">https://creativepredictions.xyz/imps</a></p></subfield> </datafield> <datafield tag="773" ind1=" " ind2=" "> <subfield code="n">url</subfield> <subfield code="i">isSupplementTo</subfield> <subfield code="a">https://github.com/cpmpercussion/imps/tree/v0.1</subfield> </datafield> <datafield tag="773" ind1=" " ind2=" "> <subfield code="n">doi</subfield> <subfield code="i">isVersionOf</subfield> <subfield code="a">10.5281/zenodo.2580175</subfield> </datafield> <datafield tag="024" ind1=" " ind2=" "> <subfield code="a">10.5281/zenodo.2580176</subfield> <subfield code="2">doi</subfield> </datafield> <datafield tag="980" ind1=" " ind2=" "> <subfield code="a">software</subfield> </datafield> </record>
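The mixture density idea behind the MDRNN can be illustrated with a small sketch: the network's output layer parameterises a Gaussian mixture (weights, means, and standard deviations), and a prediction is drawn by sampling a component and then sampling from its Gaussian. This is a hand-written, stdlib-only illustration of that sampling step, not IMPS's actual implementation; the function name `sample_mdn` and the example parameter values are hypothetical.

```python
import math
import random

def softmax(xs):
    """Normalise raw logits into mixture weights that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def sample_mdn(logits, means, log_sigmas, rng):
    """Sample one real value from a 1-D Gaussian mixture:
    pick a component by its softmax weight, then draw from its Gaussian.
    Predicting log-sigma (and exponentiating) keeps sigma positive."""
    pis = softmax(logits)
    k = rng.choices(range(len(pis)), weights=pis)[0]
    return rng.gauss(means[k], math.exp(log_sigmas[k]))

rng = random.Random(0)
# Three components; the middle one (mean 0.5) carries most of the weight.
value = sample_mdn([0.0, 2.0, 0.0], [0.1, 0.5, 0.9], [-3.0, -3.0, -3.0], rng)
```

In IMPS the network emits one such mixture per control dimension plus one for the inter-event time, so a single forward pass yields both what the next value is likely to be and when it is expected to occur.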
| | All versions | This version |
|---|---|---|
| Views | 44 | 23 |
| Downloads | 14 | 4 |
| Data volume | 19.5 MB | 5.0 MB |
| Unique views | 37 | 18 |
| Unique downloads | 10 | 3 |