Conference paper Open Access

Parameterized Melody Generation with Autoencoders and Temporally-Consistent Noise

Weber, Aline; Alegre, Lucas Nunes; Torresen, Jim; da Silva, Bruno C.

We introduce a machine learning technique to autonomously generate novel melodies that are variations of an arbitrary base melody. These are produced by a neural network that ensures that (with high probability) the melodic and rhythmic structure of the new melody is consistent with a given set of sample songs. We train a Variational Autoencoder network to identify a low-dimensional set of variables that allows for the compression and representation of sample songs. By perturbing these variables with Perlin Noise---a temporally-consistent parameterized noise function---it is possible to generate smoothly-changing novel melodies. We show that (1) by regulating the amount of noise, one can specify how much of the base song will be preserved; and (2) there is a direct correlation between the noise signal and the differences between the statistical properties of novel melodies and the original one. Users can interpret the controllable noise as a type of "creativity knob": the higher it is, the more leeway the network has to generate significantly different melodies. We present a physical prototype that allows musicians to use a keyboard to provide base melodies and to adjust the network's "creativity knobs" to regulate in real-time the process that proposes new melody ideas.
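The core idea in the abstract — perturbing a VAE's low-dimensional latent code with a temporally-consistent noise signal whose amplitude acts as a "creativity knob" — can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: `perlin1d` is a standard 1D Perlin noise (random lattice gradients blended with a quintic fade), and `perturb_latent` (a hypothetical helper name) drifts a base latent vector over time. Decoding each perturbed vector with the trained VAE decoder (not shown) would yield the smoothly-changing melody variations the paper describes.

```python
import math
import random

def perlin1d(x, gradients):
    """Classic 1D Perlin noise: blend the contributions of the two
    surrounding lattice gradients with a quintic fade curve."""
    x0 = math.floor(x)
    t = x - x0
    fade = t * t * t * (t * (t * 6 - 15) + 10)  # 6t^5 - 15t^4 + 10t^3
    g0 = gradients[x0 % len(gradients)]
    g1 = gradients[(x0 + 1) % len(gradients)]
    # contribution of each lattice point, interpolated by the fade value
    return (1 - fade) * g0 * t + fade * g1 * (t - 1)

def perturb_latent(z_base, amplitude, steps, seed=0, dt=0.05):
    """Yield a temporally-consistent sequence of latent vectors around
    z_base. `amplitude` plays the role of the "creativity knob": larger
    values push the latent code further from the base melody's encoding,
    while amplitude 0 reproduces the base melody exactly."""
    rng = random.Random(seed)
    # one independent noise track (gradient table) per latent dimension
    grads = [[rng.uniform(-1, 1) for _ in range(256)] for _ in z_base]
    for step in range(steps):
        t = step * dt  # small time increment -> smooth drift
        yield [z + amplitude * perlin1d(t, g)
               for z, g in zip(z_base, grads)]
```

Because Perlin noise is continuous in time, consecutive latent vectors stay close together, so the decoded melodies change gradually rather than jumping — this is what distinguishes the approach from adding independent Gaussian noise at every step.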

Files (1.3 MB)
nime2019_paper035.pdf (1.3 MB)
md5:2533cae36d0c0e120e30359c1c703bf5
