Conference paper Open Access

A Physical Intelligent Instrument using Recurrent Neural Networks

Næss, Torgrim Rudland; Martin, Charles Patrick

This paper describes a new intelligent interactive instrument, based on an embedded computing platform, in which deep neural networks are applied to interactive music generation. While neural networks are commonly used for music composition, many of these models support no form of user interaction. We introduce a self-contained intelligent instrument using generative models, with support for real-time interaction: the user can adjust high-level parameters to modify the music generated by the instrument. We describe the technical details of our generative model and discuss the experience of using the system in musical performance.
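The abstract does not specify which high-level parameters the instrument exposes; a common choice in RNN-based music generators is a sampling "temperature" that trades predictability against surprise. The sketch below is a hypothetical illustration of that idea (the function name, note range, and toy logits are assumptions, not the paper's implementation), assuming the network emits a vector of logits over MIDI pitches at each step.

```python
import numpy as np

def sample_note(logits, temperature=1.0, rng=None):
    """Sample the next MIDI note from a vector of RNN output logits.

    `temperature` acts as a high-level performance parameter:
    low values yield conservative, repetitive melodies, while
    high values yield more surprising ones.
    """
    rng = rng or np.random.default_rng()
    scaled = logits / max(temperature, 1e-6)   # sharpen or flatten the distribution
    probs = np.exp(scaled - scaled.max())      # stable softmax
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

# Toy demo: pretend the network strongly prefers note 60 (middle C).
logits = np.zeros(128)
logits[60] = 5.0
notes_cool = [sample_note(logits, temperature=0.1) for _ in range(8)]  # near-deterministic
notes_hot = [sample_note(logits, temperature=5.0) for _ in range(8)]   # far more varied
```

In a live setting, a physical knob could map directly onto `temperature`, giving the performer continuous control over how adventurous the generated melody is.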

Files (3.1 MB)
nime2019_paper016.pdf (md5:dbeb3b0a35d4abb408e829476b83aa99), 3.1 MB
