
Comprehensive Analysis of Feature Extraction Methods for Emotion Recognition from Multichannel EEG Recordings

Advances in signal processing and machine learning have expedited electroencephalogram (EEG)-based emotion recognition research, and numerous EEG signal features have been investigated to detect or characterize human emotions. However, most studies in this area have used relatively small monocentric data and focused on a limited range of EEG features, making it difficult to compare the utility of different sets of EEG features for emotion recognition. This study addressed that by comparing the classification accuracy (performance) of a comprehensive range of EEG feature sets for identifying emotional states, in terms of valence and arousal. The classification accuracy of five EEG feature sets was investigated, including statistical features, fractal dimension (FD), Hjorth parameters, higher order spectra (HOS), and features derived using wavelet analysis. Performance was evaluated using two classifier methods, support vector machine (SVM) and classification and regression tree (CART), across five independent and publicly available datasets linking EEG to emotional states: MAHNOB-HCI, DEAP, SEED, AMIGOS, and DREAMER. The FD-CART feature-classification method attained the best mean classification accuracy for valence (85.06%) and arousal (84.55%) across the five datasets. The stability of these findings across the five datasets also indicates that FD features derived from EEG data are reliable for emotion recognition. The results may support the development of an online feature extraction framework, thereby enabling an EEG-based emotion recognition system that operates in real time.
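
To make the pipeline described in the abstract concrete, the sketch below computes one fractal dimension (FD) feature per EEG channel and classifies the resulting feature vectors with a CART-style decision tree. It is a minimal illustration, not the authors' implementation: Higuchi's FD estimator, scikit-learn's DecisionTreeClassifier as the CART stand-in, and the toy data shapes are all assumptions made for the example.

```python
# Minimal sketch (not the authors' code) of an FD + CART pipeline:
# one fractal-dimension feature per EEG channel, fed to a decision tree.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def higuchi_fd(x, k_max=8):
    """Estimate the Higuchi fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_lk, log_inv_k = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)          # sub-sampled curve X_k^m
            if len(idx) < 2:
                continue
            # Normalised curve length for this offset (Higuchi, 1988)
            lengths.append(
                np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k * k)
            )
        log_lk.append(np.log(np.mean(lengths)))
        log_inv_k.append(np.log(1.0 / k))
    # FD is the slope of log L(k) against log(1/k)
    slope, _ = np.polyfit(log_inv_k, log_lk, 1)
    return slope

def fd_features(epoch):
    """epoch: (n_channels, n_samples) array -> one FD value per channel."""
    return np.array([higuchi_fd(ch) for ch in epoch])

# Toy stand-in data: 40 trials of 32-channel EEG (4 s at 128 Hz) with
# hypothetical binary valence labels, purely to show the shapes involved.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, 32, 512))
labels = rng.integers(0, 2, size=40)

X = np.vstack([fd_features(e) for e in epochs])
clf = DecisionTreeClassifier(random_state=0).fit(X, labels)  # CART-style tree
print(clf.score(X, labels))
```

In practice, the per-channel FD values would be computed over windowed, artifact-cleaned EEG epochs and evaluated with proper cross-validation rather than training-set accuracy.
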

Bibliographic Details
Main Authors: Yuvaraj, Rajamanickam, Thagavel, Prasanth, Thomas, John, Fogarty, Jack, Ali, Farhan
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9867328/
https://www.ncbi.nlm.nih.gov/pubmed/36679710
http://dx.doi.org/10.3390/s23020915
Collection: PubMed
Record ID: pubmed-9867328
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Sensors (Basel)
Published Online: 2023-01-12
Rights: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).