
Decoding the neural signatures of valence and arousal from portable EEG headset

Emotion classification using electroencephalography (EEG) data and machine learning techniques has been on the rise in the recent past. However, past studies have used data from medical-grade EEG setups with long set-up times and environment constraints. This paper focuses on classifying emotions on the valence-arousal plane using various feature extraction, feature selection, and machine learning techniques. We evaluate different feature extraction and selection techniques and propose the optimal set of features and electrodes for emotion recognition. Images from the OASIS image dataset were used to elicit valence and arousal emotions, and the EEG data were recorded using the Emotiv Epoc X mobile EEG headset. The analysis is also carried out on the publicly available DEAP and DREAMER datasets for benchmarking. We propose a novel feature ranking technique and an incremental learning approach to analyze how performance depends on the number of participants. Leave-one-subject-out cross-validation was carried out to identify subject bias in emotion elicitation patterns. The importance of different electrode locations was calculated, which could be used for designing a headset for emotion recognition. The collected dataset and pipeline are also published. Our study achieved a root mean square error (RMSE) of 0.905 on DREAMER, 1.902 on DEAP, and 2.728 on our dataset for the valence label, and 0.749 on DREAMER, 1.769 on DEAP, and 2.3 on our dataset for the arousal label.
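For illustration only, below is a minimal sketch of the leave-one-subject-out cross-validation with RMSE scoring described in the abstract. It is not the authors' published pipeline: the scikit-learn regressor and the synthetic feature matrix X, labels y, and subjects array are hypothetical placeholders.

```python
# Minimal sketch (not the paper's actual pipeline): leave-one-subject-out
# cross-validation with RMSE scoring for a continuous valence label.
# X, y, and subjects below are synthetic stand-ins for real EEG features,
# ratings, and per-trial subject IDs.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 32))           # e.g., band-power features per trial
y = rng.uniform(1, 7, size=400)          # e.g., valence ratings
subjects = np.repeat(np.arange(20), 20)  # 20 subjects, 20 trials each

logo = LeaveOneGroupOut()
rmse_per_subject = []
for train_idx, test_idx in logo.split(X, y, groups=subjects):
    # Train on all subjects except one, test on the held-out subject.
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X[train_idx], y[train_idx])
    pred = model.predict(X[test_idx])
    rmse_per_subject.append(np.sqrt(mean_squared_error(y[test_idx], pred)))

print(f"mean LOSO RMSE: {np.mean(rmse_per_subject):.3f}")
```

Grouping folds by subject keeps any one participant's trials out of both training and test sets at the same time, which is what exposes the subject bias in emotion elicitation patterns mentioned in the abstract.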

Bibliographic Details
Main Authors: Garg, Nikhil; Garg, Rohit; Anand, Apoorv; Baths, Veeky
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2022
Subjects: Human Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9764010/
https://www.ncbi.nlm.nih.gov/pubmed/36561835
http://dx.doi.org/10.3389/fnhum.2022.1051463
Collection: PubMed
Record ID: pubmed-9764010
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Front Hum Neurosci (Human Neuroscience)
Published Online: 2022-12-06
Copyright © 2022 Garg, Garg, Anand and Baths. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY): https://creativecommons.org/licenses/by/4.0/. The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.