Rhythmic Accompaniment Generation with Transformer Neural Networks
Description
Deep learning methods have recently emerged as state-of-the-art for many tasks related to automatic music generation. In particular, research on the task of accompaniment generation has made great advances that have the potential to greatly enhance tools for music production, performance, and education. Much of the work in this domain has focused primarily on generating either harmonic accompaniment or drum accompaniment. This thesis connects tonality and rhythm in an investigation of automatic rhythmic accompaniment for non-drum instruments: given a vocal melody, compose a rhythm for the bass. We create a dataset of multi-part rhythm pairs from the publicly available Lakh MIDI dataset [1], largely comprising contemporary Western music, and use it to train a set of conditional generative models. We evaluate these models using two measures of distribution similarity, overlapping area (OA) and Kullback-Leibler divergence (KLD) [2], to determine their ability to reproduce the aspects of rhythm that we understand to be valuable. Additionally, we explore the use of rhythmic descriptors as methods of representing and understanding the rhythmic relationships between two accompanying rhythms. Our results show that the models are generally able to capture both individual rhythmic properties and relationships between pairs of rhythms in the training dataset. Furthermore, we demonstrate that good results can be achieved even with a relatively small model.
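The two distribution-similarity measures named in the abstract can be sketched for discrete histograms of a rhythmic descriptor (e.g. note-onset counts per bar). This is a minimal illustration, not the thesis's evaluation code; the function names and the toy histograms are assumptions made for the example.

```python
import numpy as np

def overlapping_area(p, q):
    """Overlapping area (OA) of two normalized histograms:
    1.0 means identical distributions, 0.0 means disjoint support."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.minimum(p, q).sum())

def kld(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two discrete
    distributions; eps avoids log(0) on empty histogram bins."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical histograms: training-set vs. model-generated rhythms.
train = [4, 10, 20, 10, 4]
gen = [5, 12, 18, 9, 4]
print(f"OA  = {overlapping_area(train, gen):.3f}")
print(f"KLD = {kld(train, gen):.4f}")
```

A higher OA and a lower KLD both indicate that the generated rhythms' descriptor distribution is closer to the training distribution.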
Files
Patricio-Ovalle-Master-Thesis-2023.pdf
3.2 MB · md5:0f75fb08d5344dc4ac6e84373eb342dc