58408
doi
10.5281/zenodo.58408
oai:zenodo.org:58408
Altuğ Karakurt
otmm_makam_recognition_dataset: Ottoman-Turkish Makam Music Makam Recognition Dataset
Sertan Şentürk
Universitat Pompeu Fabra
url:https://github.com/MTG/otmm_makam_recognition_dataset/tree/dlfm2016
info:eu-repo/semantics/openAccess
Other (Open)
<p>OTMM Makam Recognition Dataset</p>
<p>This repository hosts the dataset designed to test makam recognition methodologies on Ottoman-Turkish makam music. It is composed of 50 recordings from each of the 20 most common makams in the <a href="http://compmusic.upf.edu/">CompMusic Project</a>'s <a href="http://dunya.compmusic.upf.edu/">Dunya</a> Ottoman-Turkish Makam Music collection. It is currently the largest dataset available for makam recognition.</p>
<p>If you use this dataset in your work, please cite the publication below:</p>
<blockquote>
<p>Karakurt, A., Şentürk, S., &amp; Serra, X. (2016). <a href="http://mtg.upf.edu/node/3538">MORTY: A Toolbox for Mode Recognition and Tonic Identification</a>. 3rd International Digital Libraries for Musicology Workshop. New York, NY</p>
</blockquote>
<p>The recordings are carefully selected from commercial releases such that they cover diverse musical forms, vocal/instrumentation settings, and recording qualities (e.g. historical vs. contemporary recordings). Each recording in the dataset is identified by a unique identifier (MBID) hosted in <a href="http://musicbrainz.org">MusicBrainz</a>. The makam and the tonic of each recording are annotated in the file <a href="https://github.com/MTG/otmm_makam_recognition_dataset/blob/master/annotations.json">annotations.json</a>.</p>
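<p>As a minimal sketch, the annotation file can be loaded and indexed by MBID as below. The field names (<code>mbid</code>, <code>makam</code>, <code>tonic</code>) are assumptions for illustration; check <code>annotations.json</code> itself for the exact keys.</p>

```python
import json

def index_annotations(path):
    """Load annotations.json and index the entries by MBID.

    Assumes the file is a JSON list of objects, each carrying
    (hypothetical) "mbid", "makam" and "tonic" fields.
    """
    with open(path) as f:
        entries = json.load(f)
    # map each recording's MBID to its full annotation entry
    return {entry["mbid"]: entry for entry in entries}
```

<p>With such an index, the makam label and annotated tonic frequency of any recording can be looked up directly from its MBID.</p>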
<p>The audio-related data in the test dataset is organized by makam in the folder <a href="https://github.com/MTG/otmm_makam_recognition_dataset/blob/master/data">data</a>. Due to copyright reasons, we are unable to distribute the audio. Instead, we provide the predominant melody of each recording, computed by a state-of-the-art <a href="https://github.com/sertansenturk/predominantmelodymakam/commit/f8b7302bc657f90e2b10a0ffd988902935adc3d6">predominant melody extraction algorithm</a> optimized for Ottoman-Turkish makam music. These features are saved as single-column text files (with the paths data/[makam]/[mbid].pitch) containing the frequency values. The timestamps are removed to reduce the file sizes. The step size of the pitch track is 0.0029 seconds (a hop size of 128 samples for an mp3 with a 44100 Hz sample rate), from which the timestamp of each sample can be recomputed.</p>
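<p>A minimal sketch of recomputing the timestamps, assuming a local .pitch file with one frequency value per line (the file path below is hypothetical):</p>

```python
# Hop size of 128 samples at 44100 Hz gives the step size quoted above (~0.0029 s).
HOP_SECONDS = 128.0 / 44100.0

def load_pitch(path):
    """Read a single-column .pitch file and restore the timestamps.

    Returns a list of (timestamp, frequency) pairs, where the i-th
    sample is assigned the time i * HOP_SECONDS.
    """
    with open(path) as f:
        freqs = [float(line) for line in f if line.strip()]
    return [(i * HOP_SECONDS, freq) for i, freq in enumerate(freqs)]
```

<p>For example, the third sample in a file would be assigned the timestamp 2 &times; 128 / 44100 &asymp; 0.0058 seconds.</p>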
<p>Moreover, the metadata of each recording is available in the repository, crawled from MusicBrainz using an <a href="https://github.com/sertansenturk/makammusicbrainz">open-source tool developed by us</a>. The metadata files are saved as data/[makam]/[mbid].json.</p>
<p>For reproducibility purposes, we note the version of all tools used to generate this dataset in the file <a href="https://github.com/MTG/otmm_makam_recognition_dataset/blob/master/algorithms.json">algorithms.json</a>.</p>
<p>A complementary toolbox for this dataset is <a href="https://github.com/altugkarakurt/morty">MORTY</a>, a mode recognition and tonic identification toolbox. It can be used with, and optimized for, any modal music culture. Further details are explained in the publication above.</p>
<p>For more information, please contact the authors.</p>
<p>This work is licensed under a <a href="http://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>.</p>
Zenodo
2016-07-21
info:eu-repo/semantics/other
597874
dlfm2016
1659713579.578762
100886527
md5:617e22ddedcf82e75e5d132f2eab5792
https://zenodo.org/records/58408/files/otmm_makam_recognition_dataset-dlfm2016.zip
public
https://github.com/MTG/otmm_makam_recognition_dataset/tree/dlfm2016
Is supplement to
url
10.5281/zenodo.597874
isVersionOf
doi