Conference paper · Open Access

Predictive Musical Interaction with MDRNNs

Martin, Charles Patrick; Torresen, Jim

An extended abstract and poster presented at the NeurIPS 2018 Workshop on Machine Learning for Creativity and Design, Montréal, Canada.

Abstract:

This paper is about creating digital musical instruments (DMIs) in which a predictive model is integrated into the interactive system. Rather than predicting symbolic music (e.g., MIDI notes), our systems predict future control data from the user together with precise temporal information. We propose that a mixture density recurrent neural network (MDRNN) is an appropriate model for this task. The predictions can be used to fill in control data when the user stops performing, or as a kind of "filter" on the user’s input. We describe our motivations, two NIMEs applying this idea, and future directions.
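To make the modelling idea concrete, the sketch below shows one way an MDRNN over continuous control data could be set up in Python with TensorFlow/Keras. This is a minimal illustration, not the authors' implementation: the layer sizes, the number of mixture components, and the two-dimensional output (one control value plus a time delta) are assumptions made for the example.

```python
# Minimal MDRNN sketch (assumed architecture, not the paper's exact model).
# The network reads a sequence of past (control_value, time_delta) pairs and
# outputs the parameters of a Gaussian mixture over the next pair.
import math
import tensorflow as tf

N_MIXTURES = 5      # number of Gaussian components (assumed)
OUT_DIM = 2         # [control_value, time_delta]
HIDDEN_UNITS = 64   # LSTM size (assumed)

inputs = tf.keras.Input(shape=(None, OUT_DIM))
h = tf.keras.layers.LSTM(HIDDEN_UNITS)(inputs)

# Mixture density head: mixture weights, means, and scales per component.
pi = tf.keras.layers.Dense(N_MIXTURES, activation="softmax", name="pi")(h)
mu = tf.keras.layers.Dense(N_MIXTURES * OUT_DIM, name="mu")(h)
sigma = tf.keras.layers.Dense(N_MIXTURES * OUT_DIM, activation="softplus", name="sigma")(h)

model = tf.keras.Model(inputs, [pi, mu, sigma])


def mdn_negative_log_likelihood(y_true, pi, mu, sigma):
    """Negative log-likelihood of y_true under a diagonal Gaussian mixture."""
    mu = tf.reshape(mu, (-1, N_MIXTURES, OUT_DIM))
    sigma = tf.reshape(sigma, (-1, N_MIXTURES, OUT_DIM)) + 1e-6
    y = tf.expand_dims(y_true, 1)  # (batch, 1, OUT_DIM)
    # Log-density of each diagonal Gaussian component.
    log_component = -0.5 * tf.reduce_sum(
        tf.square((y - mu) / sigma) + 2.0 * tf.math.log(sigma)
        + tf.math.log(2.0 * math.pi),
        axis=-1)
    # Log of the mixture density, combined with the mixture weights.
    log_mix = tf.reduce_logsumexp(tf.math.log(pi + 1e-6) + log_component, axis=-1)
    return -tf.reduce_mean(log_mix)
```

At inference time a component would be drawn from the mixture weights and the next (control value, time delta) pair sampled from the chosen Gaussian, which is how such a model could "fill in" control data or act as a filter on live input.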

This work is supported by The Research Council of Norway as part of the Engineering Predictability with Embodied Cognition (EPEC) project (#240862) and the Centres of Excellence scheme (#262762).
Files (1.3 MB): 2018-predictive-musical-interaction-with-MDRNNs.pdf (md5:69f3563ff513f77965077959e6351e70)
