4537751
doi
10.5281/zenodo.4537751
oai:zenodo.org:4537751
user-ieee
user-mir
user-eu
Gabriel Trégoat
Slim Essid
LTCI, Télécom Paris, Institut Polytechnique de Paris
Gaël Richard
LTCI, Télécom Paris, Institut Polytechnique de Paris
MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music
Giorgia Cantisani
LTCI, Télécom Paris, Institut Polytechnique de Paris
doi:10.21437/SMM.2019-11
info:eu-repo/semantics/openAccess
Creative Commons Attribution Share Alike 4.0 International
https://creativecommons.org/licenses/by-sa/4.0/legalcode
Auditory attention decoding
EEG
Polyphonic music
<p>The <em><strong>MAD-EEG Dataset</strong></em> is a research corpus for studying EEG-based decoding of auditory attention to a target instrument in polyphonic music.</p>
<p>The dataset consists of 20-channel EEG responses to music, recorded from 8 subjects while they attended to a particular instrument in a polyphonic music mixture.</p>
<p>For further details, please refer to the paper: <em><a href="https://hal.archives-ouvertes.fr/hal-02291882/document">MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music</a>.</em></p>
<p>If you use the data in your research, please cite the paper (not just the Zenodo record):</p>
<pre><code>@inproceedings{Cantisani2019,
author={Giorgia Cantisani and Gabriel Trégoat and Slim Essid and Gaël Richard},
title={{MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music}},
year=2019,
booktitle={Proc. SMM19, Workshop on Speech, Music and Mind 2019},
pages={51--55},
doi={10.21437/SMM.2019-11},
url={http://dx.doi.org/10.21437/SMM.2019-11}
}</code></pre>
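<p>The recordings are distributed as HDF5 files (<code>madeeg_raw.hdf5</code>, <code>madeeg_preprocessed.hdf5</code>). As a minimal sketch for inspecting one after download, assuming the <code>h5py</code> library is available (the internal group layout is not documented in this record, so the helper below only lists what the file actually contains rather than assuming group names):</p>

```python
import h5py


def describe_hdf5(path):
    """Return {dataset_name: shape} for every dataset in an HDF5 file.

    Walks the file with visititems so nested groups are covered; nothing
    about the MAD-EEG layout is assumed beyond it being valid HDF5.
    """
    info = {}
    with h5py.File(path, "r") as f:
        def visit(name, obj):
            if isinstance(obj, h5py.Dataset):
                info[name] = obj.shape
        f.visititems(visit)
    return info


# Example (assumes madeeg_preprocessed.hdf5 was downloaded from this record):
# print(describe_hdf5("madeeg_preprocessed.hdf5"))
```

<p>The bundled <code>tutorial-MAD-EEG.ipynb</code> notebook is the authoritative starting point for working with the data; the snippet above is only a quick way to check a download's contents.</p>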
Zenodo
2019-09-19
info:eu-repo/semantics/other
4537750
1.0.0
award_title=New Frontiers in Music Information Processing; award_number=765068; award_identifiers_scheme=url; award_identifiers_identifier=https://cordis.europa.eu/projects/765068; funder_id=00k4n6c32; funder_name=European Commission;
1630668389.387025
13693
md5:bcd8f706f0c1ab0eee8fe3211f0d8cfc
https://zenodo.org/records/4537751/files/behavioural_data.xlsx
702084981
md5:795d7eea8f66550898ca2afeec55767c
https://zenodo.org/records/4537751/files/madeeg_raw.hdf5
153880
md5:1d093597df6fb1bada04903e20b06201
https://zenodo.org/records/4537751/files/madeeg_preprocessed.yaml
3725304231
md5:92f9c3684fe72203160b0838ff2bb0f7
https://zenodo.org/records/4537751/files/madeeg_preprocessed.hdf5
1411092
md5:67f281aba3e38caeabb627ef16a60830
https://zenodo.org/records/4537751/files/tutorial-MAD-EEG.ipynb
77459
md5:cda4950e78da7c9fe94b3c139b4d711f
https://zenodo.org/records/4537751/files/madeeg_raw.yaml
300700
md5:815475b21c289107a03cb355f2d24e96
https://zenodo.org/records/4537751/files/madeeg_sequences_raw.yaml
288402242
md5:6165f80d0bc09ece2c42b10098434533
https://zenodo.org/records/4537751/files/stimuli.zip
public
10.21437/SMM.2019-11
Documents
doi
10.5281/zenodo.4537750
isVersionOf
doi