Research data · Dataset · 2019

MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music

Cantisani, Giorgia; Trégoat, Gabriel; Essid, Slim; Richard, Gaël
Open Access | English
Published: 19 Sep 2019
Publisher: Zenodo
Abstract
The MAD-EEG Dataset is a research corpus for studying EEG-based auditory attention decoding to a target instrument in polyphonic music. It consists of 20-channel EEG responses to music, recorded from 8 subjects while they attended to a particular instrument in a music mixture. For further details, please refer to the paper "MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music".

If you use the data in your research, please cite the paper (not just the Zenodo record):

@inproceedings{Cantisani2019,
  author    = {Giorgia Cantisani and Gabriel Trégoat and Slim Essid and Gaël Richard},
  title     = {{MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music}},
  booktitle = {Proc. SMM19, Workshop on Speech, Music and Mind 2019},
  year      = {2019},
  pages     = {51--55},
  doi       = {10.21437/SMM.2019-11},
  url       = {http://dx.doi.org/10.21437/SMM.2019-11}
}
Subjects

Auditory attention decoding, EEG, Polyphonic music

Funded by
EC | MIP-Frontiers
Project
MIP-Frontiers: New Frontiers in Music Information Processing
  • Funder: European Commission (EC)
  • Project Code: 765068
  • Funding stream: H2020 | MSCA-ITN-ETN