Published September 4, 2018 | Version v1
Dataset | Open Access

Flute audio labelled database for Automatic Music Transcription

Description

Automatic Music Transcription (AMT) is a well-known task in the Music Information Retrieval (MIR) domain and consists of computing a symbolic music representation from an audio recording. In this work, our focus is to adapt algorithms that extract musical information from an audio file to a particular instrument. The main objective is to study automatic transcription as part of systems that support music digitization. Currently, these techniques are applied to a generic sound timbre, i.e. to sounds from any instrument, for further analysis and conversion to a digital music encoding and a final score format. The results of this project add new knowledge to the automatic transcription field, since the transverse flute has been selected as the instrument on which to focus the whole process and, until now, there has been no database of flute sounds for this purpose.

To this end, we have recorded a set of sounds, comprising both monophonic and polyphonic music. These audio files have been processed by the chosen transcription algorithm and converted to a digital music encoding format for subsequent alignment with the original recordings. Once all these data have been converted to text, the resulting labelled database consists of the initial audio files and the final aligned files.
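As an illustration of this alignment step, the sketch below warps a transcribed MIDI file onto the original recording using dynamic time warping over chroma features. This is only a minimal example under our own assumptions, not the project's actual pipeline; the file names are placeholders, and the use of librosa and pretty_midi is an assumption.

```python
# Hypothetical alignment sketch: DTW over chroma features
# (file names below are placeholders, not files from this dataset).
import librosa
import pretty_midi

SR = 22050
HOP = 512

# Chroma from the original flute recording.
audio, _ = librosa.load("flute_take_01.wav", sr=SR)
chroma_audio = librosa.feature.chroma_cqt(y=audio, sr=SR, hop_length=HOP)

# Chroma from a synthesized rendering of the transcribed MIDI.
midi = pretty_midi.PrettyMIDI("flute_take_01_transcribed.mid")
synth = midi.synthesize(fs=SR)
chroma_midi = librosa.feature.chroma_cqt(y=synth, sr=SR, hop_length=HOP)

# DTW returns a warping path mapping MIDI frames onto audio frames.
_, wp = librosa.sequence.dtw(X=chroma_audio, Y=chroma_midi)
audio_times = librosa.frames_to_time(wp[:, 0], sr=SR, hop_length=HOP)
midi_times = librosa.frames_to_time(wp[:, 1], sr=SR, hop_length=HOP)

# The (midi_times -> audio_times) pairs can then be used to re-time each
# transcribed note onset/offset before writing an aligned MIDI file.
```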

Furthermore, after this process and from the obtained data, the behaviour of the transcription algorithm has been evaluated at two levels: note level and frame level.
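For reference, the following sketch shows how note-level and frame-level scores of this kind can be computed with the mir_eval library. This is only an illustrative example: mir_eval is an assumption rather than the tool confirmed to have been used here, and the reference/estimate values are toy placeholders.

```python
# Hypothetical evaluation sketch at note level and frame level using mir_eval.
import numpy as np
import mir_eval

# Note level: notes as (onset, offset) intervals plus pitches in Hz.
ref_intervals = np.array([[0.50, 1.00], [1.20, 1.90]])
ref_pitches = np.array([440.0, 523.25])
est_intervals = np.array([[0.52, 0.98], [1.25, 1.80]])
est_pitches = np.array([440.0, 523.25])

p, r, f, _ = mir_eval.transcription.precision_recall_f1_overlap(
    ref_intervals, ref_pitches, est_intervals, est_pitches,
    onset_tolerance=0.05)
print(f"note level: P={p:.2f} R={r:.2f} F={f:.2f}")

# Frame level: active pitches sampled on a fixed time grid (10 ms here).
times = np.arange(0.0, 2.0, 0.01)
ref_freqs = [np.array([440.0]) if 0.50 <= t < 1.00 else np.array([]) for t in times]
est_freqs = [np.array([440.0]) if 0.52 <= t < 0.98 else np.array([]) for t in times]

frame_scores = mir_eval.multipitch.evaluate(times, ref_freqs, times, est_freqs)
print("frame level:", frame_scores["Precision"], frame_scores["Recall"])
```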

This database includes the original audio files (.wav), transcribed MIDI files (.mid), aligned MIDI files (.mid), aligned text files (.txt) and evaluation files (.csv).
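A minimal loading sketch for these file types is given below, assuming librosa, pretty_midi and pandas; the example file names are placeholders. The aligned .txt files are plain text and can be read with standard tools, but their exact column layout is not specified here, so they are omitted from the sketch.

```python
# Hypothetical sketch for reading the bundled file types
# (file names are placeholders, not actual files from the archive).
import librosa
import pretty_midi
import pandas as pd

audio, sr = librosa.load("example.wav", sr=None)          # original recording
aligned = pretty_midi.PrettyMIDI("example_aligned.mid")   # aligned MIDI
notes = [(n.start, n.end, n.pitch)
         for inst in aligned.instruments for n in inst.notes]
scores = pd.read_csv("example_evaluation.csv")            # evaluation results
```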

Files

flute-audio-labelled-database-AMT.zip

Size: 114.9 MB
md5: 4580cdad0f8d3b85ac3d1118d003ebf8