Shepardson, Victor; Armitage, Jack; Magnusson, Thor
2022-09-17
<p>Deep learning-based probabilistic models of musical data are producing increasingly realistic results and promise to enter creative workflows of many kinds. Yet they have been little-studied in a performance setting, where the results of user actions typically ought to feel instantaneous. To enable such study, we designed Notochord, a deep probabilistic model for sequences of structured events, and trained an instance of it on the Lakh MIDI dataset. Our probabilistic formulation allows interpretable interventions at a sub-event level, which enables one model to act as a backbone for diverse interactive musical functions including steerable generation, harmonization, machine improvisation, and likelihood-based interfaces. Notochord can generate polyphonic and multi-track MIDI, and respond to inputs with latency below ten milliseconds. Training code, model checkpoints and interactive examples are provided as open source software.</p>
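The abstract's "interventions at a sub-event level" refer to factorizing each event into fields (e.g. pitch, timing, velocity) that are sampled one at a time, so any field can be pinned before the rest are sampled. The following is a minimal toy sketch of that idea; the distributions and field names are illustrative stand-ins, not Notochord's actual interface or learned model.

```python
import random

# Toy sub-event factorization, illustrative only:
# p(event) = p(pitch) * p(time | pitch) * p(velocity | pitch, time)

FIELDS = ("pitch", "time", "velocity")

def toy_conditional(field, partial):
    """Return a toy distribution over one field given fields sampled so far."""
    if field == "pitch":
        return {60: 0.5, 64: 0.3, 67: 0.2}   # C, E, G
    if field == "time":
        # Delay higher pitches slightly; purely illustrative conditioning.
        base = 0.25 if partial["pitch"] > 60 else 0.0
        return {base: 0.7, base + 0.5: 0.3}
    return {64: 0.6, 96: 0.4}                # velocity

def sample_event(interventions=None):
    """Sample an event field by field; `interventions` pins chosen fields."""
    event = dict(interventions or {})
    for field in FIELDS:
        if field not in event:
            dist = toy_conditional(field, event)
            values, weights = zip(*dist.items())
            event[field] = random.choices(values, weights=weights)[0]
    return event

free = sample_event()                  # unconstrained generation
pinned = sample_event({"pitch": 72})   # intervention: fix pitch, sample the rest
assert pinned["pitch"] == 72
```

Because every field passes through the same sampling step, the same mechanism supports the paper's diverse uses: pinning pitches gives harmonization, pinning timing gives steerable generation, and the per-field probabilities support likelihood-based interfaces.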
https://doi.org/10.5281/zenodo.7088404
https://doi.org/10.5281/zenodo.7088403
Open access; license: Creative Commons Attribution 4.0 International (https://creativecommons.org/licenses/by/4.0/legalcode)
AIMC 2022, The 3rd Conference on AI Music Creativity, 13-15 September 2022
Notochord: a Flexible Probabilistic Model for Embodied MIDI Performance
Conference paper