
CNN-XGBoost fusion-based affective state recognition using EEG spectrogram image analysis

Recognizing the emotional state of humans from brain signals is an active research domain with several open challenges. In this research, we propose a signal-spectrogram-image-based CNN-XGBoost fusion method for recognizing three dimensions of emotion, namely arousal (calm or excitement), valence (positive or negative feeling) and dominance (without control or empowered). We used a benchmark dataset called DREAMER, in which EEG signals were collected under multiple stimuli along with self-evaluation ratings. In the proposed method, we first compute the Short-Time Fourier Transform (STFT) of the EEG signals and convert the result into RGB images to obtain the spectrograms. We then train a two-dimensional Convolutional Neural Network (CNN) on the spectrogram images and retrieve features from a dense layer of the trained network. We apply an Extreme Gradient Boosting (XGBoost) classifier to the extracted CNN features to classify the signals along the arousal, valence and dominance dimensions of human emotion. We compare our results with feature-fusion-based state-of-the-art approaches to emotion recognition. To do this, we applied various feature extraction techniques to the signals, including the Fast Fourier Transform, the Discrete Cosine Transform, Poincaré features, Power Spectral Density, Hjorth parameters and several statistical features. Additionally, we used Chi-square and Recursive Feature Elimination techniques to select the discriminative features. We formed feature vectors by feature-level fusion and applied Support Vector Machine (SVM) and Extreme Gradient Boosting (XGBoost) classifiers to the fused features to classify the different emotion levels. The performance study shows that the proposed spectrogram-image-based CNN-XGBoost fusion method outperforms the feature-fusion-based SVM and XGBoost methods. The proposed method achieved accuracies of 99.712% for arousal, 99.770% for valence and 99.770% for dominance in human emotion detection.
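
As an illustration of the first step described above, a minimal sketch of turning one EEG channel into an RGB spectrogram image via the STFT might look like the following; the 128 Hz sampling rate, window length, overlap and output size are assumptions for illustration, not values reported by the authors.

```python
# Minimal sketch (not the authors' code): one EEG channel -> STFT -> RGB spectrogram image.
# Sampling rate, window length, overlap and image size are illustrative assumptions.
import numpy as np
from scipy.signal import stft
import matplotlib.pyplot as plt

def eeg_channel_to_spectrogram_png(signal, fs=128, out_path="spectrogram.png"):
    """Compute an STFT magnitude spectrogram and save it as a colour-mapped RGB image."""
    f, t, Zxx = stft(signal, fs=fs, nperseg=256, noverlap=128)
    power_db = 20 * np.log10(np.abs(Zxx) + 1e-12)  # magnitude in dB

    fig, ax = plt.subplots(figsize=(2.24, 2.24), dpi=100)  # roughly 224 x 224 pixels
    ax.pcolormesh(t, f, power_db, shading="gouraud")       # colour-mapped time-frequency image
    ax.axis("off")                                          # keep only the image content
    fig.savefig(out_path, bbox_inches="tight", pad_inches=0)
    plt.close(fig)

# Example with synthetic data standing in for a real EEG channel.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_eeg = rng.standard_normal(128 * 60)  # 60 s of noise at an assumed 128 Hz
    eeg_channel_to_spectrogram_png(fake_eeg)
```

In practice one such image would be generated per channel and per trial, and the resulting images would form the CNN training set.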


Bibliographic Details
Main Authors: Khan, Md. Sakib, Salsabil, Nishat, Alam, Md. Golam Rabiul, Dewan, M. Ali Akber, Uddin, Md. Zia
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9391364/
https://www.ncbi.nlm.nih.gov/pubmed/35986065
http://dx.doi.org/10.1038/s41598-022-18257-x
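
To make the CNN-XGBoost fusion idea from the abstract concrete, here is a minimal, hypothetical sketch: a small 2D CNN is trained on the spectrogram images, its penultimate dense layer is reused as a feature extractor, and an XGBoost classifier is fitted on those features. The architecture, layer sizes and hyperparameters are illustrative assumptions, not the authors' actual configuration.

```python
# Hypothetical sketch of CNN feature extraction followed by XGBoost classification.
import numpy as np
import tensorflow as tf
from xgboost import XGBClassifier

def build_cnn(input_shape=(224, 224, 3), n_classes=2):
    """Small 2D CNN with a named dense layer used later as the feature layer."""
    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.layers.Conv2D(32, 3, activation="relu")(inputs)
    x = tf.keras.layers.MaxPooling2D()(x)
    x = tf.keras.layers.Conv2D(64, 3, activation="relu")(x)
    x = tf.keras.layers.MaxPooling2D()(x)
    x = tf.keras.layers.Flatten()(x)
    features = tf.keras.layers.Dense(128, activation="relu", name="features")(x)
    outputs = tf.keras.layers.Dense(n_classes, activation="softmax")(features)
    return tf.keras.Model(inputs, outputs)

def cnn_xgboost_fusion(x_train, y_train, x_test):
    """Train the CNN, extract dense-layer features, then classify them with XGBoost."""
    cnn = build_cnn(n_classes=len(np.unique(y_train)))
    cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    cnn.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)

    # Re-use the trained network up to the named dense layer as a feature extractor.
    extractor = tf.keras.Model(cnn.input, cnn.get_layer("features").output)
    train_feats = extractor.predict(x_train, verbose=0)
    test_feats = extractor.predict(x_test, verbose=0)

    # XGBoost on the extracted CNN features (hyperparameters are placeholders).
    booster = XGBClassifier(n_estimators=200, max_depth=6, learning_rate=0.1)
    booster.fit(train_feats, y_train)
    return booster.predict(test_feats)
```

A separate model of this kind would typically be trained for each emotion dimension (arousal, valence, dominance), with labels derived from the self-evaluation ratings.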
_version_ 1784770841972572160
author Khan, Md. Sakib
Salsabil, Nishat
Alam, Md. Golam Rabiul
Dewan, M. Ali Akber
Uddin, Md. Zia
author_facet Khan, Md. Sakib
Salsabil, Nishat
Alam, Md. Golam Rabiul
Dewan, M. Ali Akber
Uddin, Md. Zia
author_sort Khan, Md. Sakib
collection PubMed
description Recognizing the emotional state of humans from brain signals is an active research domain with several open challenges. In this research, we propose a signal-spectrogram-image-based CNN-XGBoost fusion method for recognizing three dimensions of emotion, namely arousal (calm or excitement), valence (positive or negative feeling) and dominance (without control or empowered). We used a benchmark dataset called DREAMER, in which EEG signals were collected under multiple stimuli along with self-evaluation ratings. In the proposed method, we first compute the Short-Time Fourier Transform (STFT) of the EEG signals and convert the result into RGB images to obtain the spectrograms. We then train a two-dimensional Convolutional Neural Network (CNN) on the spectrogram images and retrieve features from a dense layer of the trained network. We apply an Extreme Gradient Boosting (XGBoost) classifier to the extracted CNN features to classify the signals along the arousal, valence and dominance dimensions of human emotion. We compare our results with feature-fusion-based state-of-the-art approaches to emotion recognition. To do this, we applied various feature extraction techniques to the signals, including the Fast Fourier Transform, the Discrete Cosine Transform, Poincaré features, Power Spectral Density, Hjorth parameters and several statistical features. Additionally, we used Chi-square and Recursive Feature Elimination techniques to select the discriminative features. We formed feature vectors by feature-level fusion and applied Support Vector Machine (SVM) and Extreme Gradient Boosting (XGBoost) classifiers to the fused features to classify the different emotion levels. The performance study shows that the proposed spectrogram-image-based CNN-XGBoost fusion method outperforms the feature-fusion-based SVM and XGBoost methods. The proposed method achieved accuracies of 99.712% for arousal, 99.770% for valence and 99.770% for dominance in human emotion detection.
format Online
Article
Text
id pubmed-9391364
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-9391364 2022-08-21 CNN-XGBoost fusion-based affective state recognition using EEG spectrogram image analysis Khan, Md. Sakib; Salsabil, Nishat; Alam, Md. Golam Rabiul; Dewan, M. Ali Akber; Uddin, Md. Zia. Sci Rep, Article. [Abstract as given in the description field above.] Nature Publishing Group UK, 2022-08-19. /pmc/articles/PMC9391364/ /pubmed/35986065 http://dx.doi.org/10.1038/s41598-022-18257-x Text en © The Author(s) 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Khan, Md. Sakib
Salsabil, Nishat
Alam, Md. Golam Rabiul
Dewan, M. Ali Akber
Uddin, Md. Zia
CNN-XGBoost fusion-based affective state recognition using EEG spectrogram image analysis
title CNN-XGBoost fusion-based affective state recognition using EEG spectrogram image analysis
title_full CNN-XGBoost fusion-based affective state recognition using EEG spectrogram image analysis
title_fullStr CNN-XGBoost fusion-based affective state recognition using EEG spectrogram image analysis
title_full_unstemmed CNN-XGBoost fusion-based affective state recognition using EEG spectrogram image analysis
title_short CNN-XGBoost fusion-based affective state recognition using EEG spectrogram image analysis
title_sort cnn-xgboost fusion-based affective state recognition using eeg spectrogram image analysis
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9391364/
https://www.ncbi.nlm.nih.gov/pubmed/35986065
http://dx.doi.org/10.1038/s41598-022-18257-x
work_keys_str_mv AT khanmdsakib cnnxgboostfusionbasedaffectivestaterecognitionusingeegspectrogramimageanalysis
AT salsabilnishat cnnxgboostfusionbasedaffectivestaterecognitionusingeegspectrogramimageanalysis
AT alammdgolamrabiul cnnxgboostfusionbasedaffectivestaterecognitionusingeegspectrogramimageanalysis
AT dewanmaliakber cnnxgboostfusionbasedaffectivestaterecognitionusingeegspectrogramimageanalysis
AT uddinmdzia cnnxgboostfusionbasedaffectivestaterecognitionusingeegspectrogramimageanalysis
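
For comparison, the feature-fusion baseline mentioned in the abstract (handcrafted features, feature-level fusion, Chi-square or Recursive Feature Elimination selection, then SVM and XGBoost) could be sketched roughly as follows. Only PSD band powers and Hjorth parameters are implemented here for brevity, RFE is omitted, and the band edges, selector size and classifier hyperparameters are assumptions rather than the paper's settings.

```python
# Hypothetical sketch of a feature-fusion baseline: handcrafted features per channel,
# concatenated (feature-level fusion), Chi-square selection, then SVM and XGBoost.
import numpy as np
from scipy.signal import welch
from sklearn.feature_selection import SelectKBest, chi2  # RFE could be used analogously
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC
from xgboost import XGBClassifier

def hjorth_parameters(x):
    """Hjorth activity, mobility and complexity of a 1-D signal."""
    dx, ddx = np.diff(x), np.diff(np.diff(x))
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return np.array([activity, mobility, complexity])

def band_powers(x, fs=128, bands=((4, 8), (8, 13), (13, 30), (30, 45))):
    """Mean PSD in classic EEG bands (theta, alpha, beta, gamma); band edges are assumptions."""
    f, psd = welch(x, fs=fs, nperseg=fs * 2)
    return np.array([psd[(f >= lo) & (f < hi)].mean() for lo, hi in bands])

def fused_features(trial):
    """Feature-level fusion: concatenate per-channel handcrafted features."""
    return np.concatenate([np.r_[band_powers(ch), hjorth_parameters(ch)] for ch in trial])

def baseline_classify(trials_train, y_train, trials_test):
    X_train = np.vstack([fused_features(t) for t in trials_train])
    X_test = np.vstack([fused_features(t) for t in trials_test])

    # Chi-square selection requires non-negative inputs, hence the min-max scaling.
    scaler = MinMaxScaler().fit(X_train)
    selector = SelectKBest(chi2, k=min(32, X_train.shape[1]))
    X_train_sel = selector.fit_transform(scaler.transform(X_train), y_train)
    X_test_sel = selector.transform(scaler.transform(X_test))

    svm_pred = SVC(kernel="rbf", C=1.0).fit(X_train_sel, y_train).predict(X_test_sel)
    xgb_pred = XGBClassifier(n_estimators=200).fit(X_train_sel, y_train).predict(X_test_sel)
    return svm_pred, xgb_pred
```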