Dataset Restricted Access
Garg, Nikhil; Garg, Rohit; Anand, Apoorv; Baths, Veeky
<?xml version='1.0' encoding='UTF-8'?>
<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nmm##2200000uu#4500</leader>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2022-11-17</subfield>
  </datafield>
  <controlfield tag="005">20221118142635.0</controlfield>
  <datafield tag="500" ind1=" " ind2=" ">
    <subfield code="a">Please cite as: N. Garg, R. Garg, A. Anand, V. Baths, "Decoding the Neural Signatures of Valence and Arousal From Portable EEG Headset," Frontiers in Human Neuroscience, Nov. 2022. doi: 10.3389/fnhum.2022.1051463</subfield>
  </datafield>
  <controlfield tag="001">7332684</controlfield>
  <datafield tag="909" ind1="C" ind2="O">
    <subfield code="p">openaire_data</subfield>
    <subfield code="o">oai:zenodo.org:7332684</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="4">
    <subfield code="p">Frontiers in Human Neuroscience</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a"><p>Emotion classification using electroencephalography (EEG) data and machine learning techniques has been on the rise in recent years. However, past studies have relied on medical-grade EEG setups with long set-up times and environmental constraints. In this study, images from the OASIS image dataset were used to elicit valence and arousal emotions, and the EEG data were recorded using the Emotiv Epoc X mobile EEG headset. We propose a novel feature ranking technique and an incremental learning approach to analyze how performance depends on the number of participants. The analysis is carried out on the publicly available DEAP and DREAMER datasets for benchmarking. Leave-one-subject-out cross-validation was carried out to identify subject bias in emotion elicitation patterns. The collected dataset and pipeline are made open source.</p> <p>Code: <a href="https://github.com/rohitgarg025/Decoding_EEG">https://github.com/rohitgarg025/Decoding_EEG</a></p></subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Birla Institute of Technology and Science, India</subfield>
    <subfield code="a">Garg, Rohit</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Birla Institute of Technology and Science, India</subfield>
    <subfield code="a">Anand, Apoorv</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Birla Institute of Technology and Science, India</subfield>
    <subfield code="a">Baths, Veeky</subfield>
  </datafield>
  <datafield tag="542" ind1=" " ind2=" ">
    <subfield code="l">restricted</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">dataset</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="u">UMR8520 Institut d'électronique, de microélectronique et de nanotechnologie (IEMN), France</subfield>
    <subfield code="a">Garg, Nikhil</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Electroencephalography (EEG)</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Brain Computer Interface (BCI)</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Machine learning</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Valence</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Arousal</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Emotion</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">Feature engineering</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.3389/fnhum.2022.1051463</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">OASIS EEG Dataset: Decoding the Neural Signatures of Valence and Arousal From Portable EEG Headset</subfield>
  </datafield>
</record>
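The abstract above reports leave-one-subject-out (LOSO) cross-validation to expose subject bias in emotion elicitation. A minimal sketch of that evaluation scheme, assuming scikit-learn and synthetic stand-in features (this is illustrative only, not the authors' pipeline from the linked repository): each subject's trials form exactly one held-out fold, so the classifier is always tested on a subject it never saw during training.

```python
# Illustrative LOSO cross-validation sketch with synthetic EEG-like features.
# All sizes and the classifier choice are assumptions for demonstration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)

n_subjects, trials_per_subject, n_features = 5, 40, 16
X = rng.normal(size=(n_subjects * trials_per_subject, n_features))
y = rng.integers(0, 2, size=n_subjects * trials_per_subject)  # e.g. low/high valence
groups = np.repeat(np.arange(n_subjects), trials_per_subject)  # subject ID per trial

# LeaveOneGroupOut yields one fold per unique group (here: per subject).
logo = LeaveOneGroupOut()
scores = []
for train_idx, test_idx in logo.split(X, y, groups):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    scores.append(accuracy_score(y[test_idx], clf.predict(X[test_idx])))

print("Per-subject accuracies:", np.round(scores, 2))
print(f"Mean LOSO accuracy: {np.mean(scores):.2f}")
```

A large spread between per-subject accuracies under this scheme is one symptom of the subject bias the abstract refers to: features that separate emotions well for some participants generalize poorly to others.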
| Metric           | Count   |
|------------------|---------|
| Views            | 226     |
| Downloads        | 48      |
| Data volume      | 65.4 MB |
| Unique views     | 179     |
| Unique downloads | 7       |