Published December 8, 2018 | Version v1
Conference paper | Open Access

Predictive Musical Interaction with MDRNNs

  • University of Oslo

Description

An extended abstract and poster presented at the NeurIPS 2018 Workshop on Machine Learning for Creativity and Design, Montréal, Canada.

Abstract:

This paper is about creating digital musical instruments (DMIs) where a predictive model is integrated into the interactive system. Rather than predicting symbolic music (e.g., MIDI notes), our systems predict future control data from the user together with precise temporal information. We propose that a mixture density recurrent neural network (MDRNN) is an appropriate model for this task. The predictions can be used to fill in control data when the user stops performing, or as a kind of "filter" on the user's input. We describe our motivations, two NIMEs applying this idea, and future directions.
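To make the modelling idea concrete, below is a minimal sketch of an MDRNN in PyTorch. This is an assumption for illustration only: the abstract does not specify an implementation, and the class, layer, and variable names here (MDRNN, to_params, the 3-dimensional control frame, and so on) are hypothetical. The sketch shows the core structure described above: a recurrent network whose output parameterises a Gaussian mixture over the next control frame, trained by negative log-likelihood and sampled from at performance time.

import torch
import torch.nn as nn
import torch.distributions as D


class MDRNN(nn.Module):
    """Illustrative MDRNN: an LSTM that outputs the parameters of a
    Gaussian mixture over the next control frame (e.g. x, y, time delta)."""

    def __init__(self, input_dim=3, hidden_dim=64, n_mixtures=5):
        super().__init__()
        self.input_dim = input_dim
        self.n_mixtures = n_mixtures
        self.rnn = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        # Per mixture component: one weight logit, a mean vector, and a
        # (diagonal) log standard deviation vector over the input dims.
        self.to_params = nn.Linear(hidden_dim, n_mixtures * (1 + 2 * input_dim))

    def forward(self, x, state=None):
        h, state = self.rnn(x, state)
        params = self.to_params(h)
        logits, mu, log_sigma = torch.split(
            params,
            [self.n_mixtures,
             self.n_mixtures * self.input_dim,
             self.n_mixtures * self.input_dim],
            dim=-1)
        mu = mu.reshape(*mu.shape[:-1], self.n_mixtures, self.input_dim)
        sigma = log_sigma.reshape_as(mu).exp()
        # Mixture of diagonal Gaussians over the next control frame.
        mix = D.Categorical(logits=logits)
        comp = D.Independent(D.Normal(mu, sigma), 1)
        return D.MixtureSameFamily(mix, comp), state


# Training minimises the negative log-likelihood of the next control frame;
# at performance time, sampling from the mixture yields predicted control data.
model = MDRNN()
seq = torch.rand(1, 32, 3)                # one sequence of 32 control frames
dist, _ = model(seq[:, :-1])              # predict each following frame
loss = -dist.log_prob(seq[:, 1:]).mean()  # NLL training objective
next_frame = dist.sample()[:, -1]         # sample a predicted next frame

In this sketch, the sampled next_frame could be fed back into the model to continue generating control data when the performer stops, or blended with live input to act as the "filter" on the user's control signal described in the abstract.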

Notes

Presented at the NeurIPS 2018 Workshop on Machine Learning for Creativity and Design, Montréal, Canada. This work is supported by The Research Council of Norway as part of the Engineering Predictability with Embodied Cognition (EPEC) project #240862 and Centres of Excellence scheme #262762.

Files (1.3 MB)

2018-predictive-musical-interaction-with-MDRNNs.pdf