Dataset (Open Access)

MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music

Giorgia Cantisani; Gabriel Trégoat; Slim Essid; Gaël Richard


Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Giorgia Cantisani</dc:creator>
  <dc:creator>Gabriel Trégoat</dc:creator>
  <dc:creator>Slim Essid</dc:creator>
  <dc:creator>Gaël Richard</dc:creator>
  <dc:date>2019-09-19</dc:date>
  <dc:description>The MAD-EEG Dataset is a research corpus for studying EEG-based auditory attention decoding to a target instrument in polyphonic music. 

The dataset consists of 20-channel EEG responses to music recorded from 8 subjects while attending to a particular instrument in a music mixture. 

For further details, please refer to the paper: MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music.

If you use the data in your research, please reference the paper (not just the Zenodo record):

@inproceedings{Cantisani2019,
  author={Giorgia Cantisani and Gabriel Trégoat and Slim Essid and Gaël Richard},
  title={{MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music}},
  year=2019,
  booktitle={Proc. SMM19, Workshop on Speech, Music and Mind 2019},
  pages={51--55},
  doi={10.21437/SMM.2019-11},
  url={http://dx.doi.org/10.21437/SMM.2019-11}
}

 </dc:description>
  <dc:identifier>https://zenodo.org/record/4537751</dc:identifier>
  <dc:identifier>10.5281/zenodo.4537751</dc:identifier>
  <dc:identifier>oai:zenodo.org:4537751</dc:identifier>
  <dc:language>eng</dc:language>
  <dc:relation>info:eu-repo/grantAgreement/EC/H2020/765068/</dc:relation>
  <dc:relation>doi:10.21437/SMM.2019-11</dc:relation>
  <dc:relation>doi:10.5281/zenodo.4537750</dc:relation>
  <dc:relation>url:https://zenodo.org/communities/ieee</dc:relation>
  <dc:relation>url:https://zenodo.org/communities/mir</dc:relation>
  <dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
  <dc:rights>https://creativecommons.org/licenses/by-sa/4.0/legalcode</dc:rights>
  <dc:subject>Auditory attention decoding</dc:subject>
  <dc:subject>EEG</dc:subject>
  <dc:subject>Polyphonic music</dc:subject>
  <dc:title>MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music</dc:title>
  <dc:type>info:eu-repo/semantics/other</dc:type>
  <dc:type>dataset</dc:type>
</oai_dc:dc>
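
The record metadata above follows the standard OAI Dublin Core schema, so it can be read with any generic XML tooling. Below is a minimal sketch, assuming the export has been saved locally; the filename mad_eeg_dc.xml is a placeholder, not part of the Zenodo record.

import xml.etree.ElementTree as ET

# Namespaces used by the oai_dc export shown above.
NS = {
    "oai_dc": "http://www.openarchives.org/OAI/2.0/oai_dc/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def parse_dublin_core(path):
    """Return a dict of selected Dublin Core fields from an oai_dc XML file."""
    root = ET.parse(path).getroot()
    return {
        "title": root.findtext("dc:title", namespaces=NS),
        "creators": [e.text for e in root.findall("dc:creator", NS)],
        "date": root.findtext("dc:date", namespaces=NS),
        "identifiers": [e.text for e in root.findall("dc:identifier", NS)],
        "subjects": [e.text for e in root.findall("dc:subject", NS)],
    }

if __name__ == "__main__":
    # Placeholder path: save the XML export above to this file before running.
    record = parse_dublin_core("mad_eeg_dc.xml")
    print(record["title"])
    print(", ".join(record["creators"]))
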
Statistics (all versions / this version):
  Views: 207 / 207
  Downloads: 133 / 133
  Data volume: 122.7 GB / 122.7 GB
  Unique views: 175 / 175
  Unique downloads: 52 / 52
