Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms
Main Authors: Masood, Naveen; Farooq, Humera
Format: Online Article Text
Language: English
Published: MDPI, 2021
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8229332/ https://www.ncbi.nlm.nih.gov/pubmed/34070554 http://dx.doi.org/10.3390/brainsci11060696
_version_: 1783712952872861696
author: Masood, Naveen; Farooq, Humera
author_facet: Masood, Naveen; Farooq, Humera
author_sort: Masood, Naveen
collection: PubMed
description: Most electroencephalography (EEG)-based emotion recognition systems rely on a single stimulus to evoke emotions, typically videos, sounds, or images. Few studies have addressed self-induced emotions, and the question of whether different stimulus presentation paradigms for the same emotion produce any subject- and stimulus-independent neural correlates remains unanswered. Although publicly available datasets are used in a large number of studies targeting EEG-based human emotional state recognition, a central aim of this work is classifying emotions while subjects experience different stimulus presentation paradigms, so new experiments were required. This paper presents a novel experimental study that recorded EEG data for three human emotional states (fear, neutral, and joy) evoked with four stimulus presentation paradigms: emotional imagery, pictures, sounds, and audio–video movie clips. Features were extracted from the recorded EEG data with common spatial patterns (CSP) and classified with linear discriminant analysis (LDA). Experiments were conducted with twenty-five participants. Classification performance in the different paradigms was evaluated across spectral bands. With a few exceptions, all paradigms showed the best emotion recognition in the higher-frequency spectral ranges, and joy was classified more accurately than fear. The average neural patterns for fear vs. joy are presented as topographical maps based on the CSP spatial filters for averaged band-power changes in all four paradigms. Among spectral bands, beta and alpha oscillation responses produced the highest number of significant results for the paradigms under consideration. Among brain regions, the frontal lobe produced the most significant results irrespective of paradigm and spectral band; the temporal site also contributed statistically significant findings. To the best of our knowledge, no previous study of EEG emotion recognition has considered four different stimulus presentation paradigms. This work contributes towards designing EEG-based systems for human emotion recognition that could work effectively in different real-time scenarios.
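The classification pipeline named in the abstract (CSP spatial filtering followed by LDA on band-power features) can be sketched as below. This is an illustrative reconstruction on synthetic two-class data, not the authors' code: the channel count, number of CSP filters, and the toy signal model (one class with inflated variance on a single channel) are all assumptions.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 40, 8, 256

# Synthetic two-class "EEG": class 1 has extra variance on channel 0.
X = rng.standard_normal((2 * n_trials, n_channels, n_samples))
y = np.repeat([0, 1], n_trials)
X[y == 1, 0, :] *= 3.0

def class_cov(trials):
    # Average trace-normalized spatial covariance over trials of one class
    covs = [t @ t.T / np.trace(t @ t.T) for t in trials]
    return np.mean(covs, axis=0)

C0, C1 = class_cov(X[y == 0]), class_cov(X[y == 1])

# CSP: generalized eigenvalue problem C0 w = lambda (C0 + C1) w;
# filters at both ends of the spectrum maximize the variance ratio
# between the two classes.
eigvals, W = eigh(C0, C0 + C1)
filters = np.column_stack([W[:, :2], W[:, -2:]])  # (n_channels, 4)

def features(trials):
    # Project trials through the spatial filters, then take
    # log-variance of each filtered signal as the feature vector.
    Z = np.einsum('ck,tcs->tks', filters, trials)
    return np.log(Z.var(axis=2))

clf = LinearDiscriminantAnalysis().fit(features(X), y)
acc = clf.score(features(X), y)
```

In practice the data would first be band-pass filtered into the spectral band of interest (e.g. alpha or beta), since CSP operates on band-limited variance; that step is omitted here for brevity.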
format: Online Article Text
id: pubmed-8229332
institution: National Center for Biotechnology Information
language: English
publishDate: 2021
publisher: MDPI
record_format: MEDLINE/PubMed
spelling: pubmed-8229332 2021-06-26 Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms Masood, Naveen; Farooq, Humera Brain Sci Article MDPI 2021-05-25 /pmc/articles/PMC8229332/ /pubmed/34070554 http://dx.doi.org/10.3390/brainsci11060696 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle: Article; Masood, Naveen; Farooq, Humera; Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms
title: Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms
title_full: Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms
title_fullStr: Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms
title_full_unstemmed: Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms
title_short: Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms
title_sort: comparing neural correlates of human emotions across multiple stimulus presentation paradigms
topic: Article
url: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8229332/ https://www.ncbi.nlm.nih.gov/pubmed/34070554 http://dx.doi.org/10.3390/brainsci11060696
work_keys_str_mv: AT masoodnaveen comparingneuralcorrelatesofhumanemotionsacrossmultiplestimuluspresentationparadigms; AT farooqhumera comparingneuralcorrelatesofhumanemotionsacrossmultiplestimuluspresentationparadigms