Published June 1, 2019
Version v1
Conference paper
Open Access
A Physical Intelligent Instrument using Recurrent Neural Networks
Description
This paper describes a new intelligent interactive instrument, built on an embedded computing platform, that applies deep neural networks to interactive music generation. Although neural networks are commonly used for music composition, many such models support no form of user interaction. We introduce a self-contained intelligent instrument based on generative models that supports real-time interaction: the user can adjust high-level parameters to shape the music the instrument generates. We describe the technical details of our generative model and discuss the experience of using the system in musical performance.
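The abstract mentions user-adjustable high-level parameters that shape the generated music but does not specify them here. As a hedged sketch, one common such control for a neural music generator is sampling temperature, which scales the model's output distribution before an event is drawn; the function name and the toy logits below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Draw one musical-event index from model output logits.

    Temperature is a typical high-level performance control: low values
    make the generated sequence more predictable, high values more varied.
    (Illustrative sketch -- not the paper's actual model.)
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-6)
    probs = np.exp(scaled - scaled.max())  # softmax, numerically stable
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

# Toy "model output": logits over 4 possible musical events.
logits = [2.0, 1.0, 0.5, 0.1]
rng = np.random.default_rng(0)

# Near-zero temperature: sampling collapses onto the most likely event.
cold = [sample_with_temperature(logits, temperature=0.01, rng=rng) for _ in range(8)]

# Higher temperature: the performer gets a more exploratory output.
warm = [sample_with_temperature(logits, temperature=2.0, rng=rng) for _ in range(8)]
```

In an instrument like the one described, a physical knob or slider could be mapped to this parameter so the performer steers the generation in real time.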
Files

Name | Size |
---|---|
nime2019_paper016.pdf | 3.1 MB |

md5:dbeb3b0a35d4abb408e829476b83aa99