
Multi-Scale Frequency Bands Ensemble Learning for EEG-Based Emotion Recognition

Emotion recognition has a wide range of potential applications in the real world. Among the data sources for emotion recognition, electroencephalography (EEG) signals record neural activity across the human brain, providing a reliable way to recognize emotional states. Most existing...

Full description

Bibliographic Details
Main Authors: Shen, Fangyao, Peng, Yong, Kong, Wanzeng, Dai, Guojun
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7916620/
https://www.ncbi.nlm.nih.gov/pubmed/33578835
http://dx.doi.org/10.3390/s21041262
_version_ 1783657519320662016
author Shen, Fangyao
Peng, Yong
Kong, Wanzeng
Dai, Guojun
author_facet Shen, Fangyao
Peng, Yong
Kong, Wanzeng
Dai, Guojun
author_sort Shen, Fangyao
collection PubMed
description Emotion recognition has a wide range of potential applications in the real world. Among the data sources for emotion recognition, electroencephalography (EEG) signals record neural activity across the human brain, providing a reliable way to recognize emotional states. Most existing EEG-based emotion recognition studies directly concatenate features extracted from all EEG frequency bands for emotion classification. This approach assumes by default that all frequency bands are equally important; however, it cannot always achieve optimal performance. In this paper, we present a novel multi-scale frequency bands ensemble learning (MSFBEL) method to perform emotion recognition from EEG signals. Concretely, we first re-organize all frequency bands into several local scales and one global scale. Then we train a base classifier on each scale. Finally, we fuse the results of all scales with an adaptive weight learning method that automatically assigns larger weights to more important scales to further improve performance. The proposed method is validated on two public data sets. For the “SEED IV” data set, MSFBEL achieves average accuracies of 82.75%, 87.87%, and 78.27% on the three sessions under the within-session experimental paradigm. For the “DEAP” data set, it obtains an average accuracy of 74.22% for four-category classification under 5-fold cross-validation. The experimental results demonstrate that the scale of the frequency bands influences the emotion recognition rate, and that the global scale, which directly concatenates all frequency bands, cannot always guarantee the best emotion recognition performance. Different scales provide complementary information to each other, and the proposed adaptive weight learning method can effectively fuse them to further enhance performance.
format Online
Article
Text
id pubmed-7916620
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7916620 2021-03-01 Multi-Scale Frequency Bands Ensemble Learning for EEG-Based Emotion Recognition Shen, Fangyao Peng, Yong Kong, Wanzeng Dai, Guojun Sensors (Basel) Article Emotion recognition has a wide range of potential applications in the real world. Among the data sources for emotion recognition, electroencephalography (EEG) signals record neural activity across the human brain, providing a reliable way to recognize emotional states. Most existing EEG-based emotion recognition studies directly concatenate features extracted from all EEG frequency bands for emotion classification. This approach assumes by default that all frequency bands are equally important; however, it cannot always achieve optimal performance. In this paper, we present a novel multi-scale frequency bands ensemble learning (MSFBEL) method to perform emotion recognition from EEG signals. Concretely, we first re-organize all frequency bands into several local scales and one global scale. Then we train a base classifier on each scale. Finally, we fuse the results of all scales with an adaptive weight learning method that automatically assigns larger weights to more important scales to further improve performance. The proposed method is validated on two public data sets. For the “SEED IV” data set, MSFBEL achieves average accuracies of 82.75%, 87.87%, and 78.27% on the three sessions under the within-session experimental paradigm. For the “DEAP” data set, it obtains an average accuracy of 74.22% for four-category classification under 5-fold cross-validation. The experimental results demonstrate that the scale of the frequency bands influences the emotion recognition rate, and that the global scale, which directly concatenates all frequency bands, cannot always guarantee the best emotion recognition performance. Different scales provide complementary information to each other, and the proposed adaptive weight learning method can effectively fuse them to further enhance performance. MDPI 2021-02-10 /pmc/articles/PMC7916620/ /pubmed/33578835 http://dx.doi.org/10.3390/s21041262 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Shen, Fangyao
Peng, Yong
Kong, Wanzeng
Dai, Guojun
Multi-Scale Frequency Bands Ensemble Learning for EEG-Based Emotion Recognition
title Multi-Scale Frequency Bands Ensemble Learning for EEG-Based Emotion Recognition
title_full Multi-Scale Frequency Bands Ensemble Learning for EEG-Based Emotion Recognition
title_fullStr Multi-Scale Frequency Bands Ensemble Learning for EEG-Based Emotion Recognition
title_full_unstemmed Multi-Scale Frequency Bands Ensemble Learning for EEG-Based Emotion Recognition
title_short Multi-Scale Frequency Bands Ensemble Learning for EEG-Based Emotion Recognition
title_sort multi-scale frequency bands ensemble learning for eeg-based emotion recognition
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7916620/
https://www.ncbi.nlm.nih.gov/pubmed/33578835
http://dx.doi.org/10.3390/s21041262
work_keys_str_mv AT shenfangyao multiscalefrequencybandsensemblelearningforeegbasedemotionrecognition
AT pengyong multiscalefrequencybandsensemblelearningforeegbasedemotionrecognition
AT kongwanzeng multiscalefrequencybandsensemblelearningforeegbasedemotionrecognition
AT daiguojun multiscalefrequencybandsensemblelearningforeegbasedemotionrecognition
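
The abstract above outlines the MSFBEL pipeline: EEG frequency bands are re-organized into several local scales plus one global scale, a base classifier is trained on each scale, and the per-scale outputs are fused with adaptively learned weights. The Python sketch below illustrates that pipeline under stated assumptions only: the BANDS column layout, the particular local scales, the logistic-regression base learner, and the validation-accuracy weighting are hypothetical stand-ins, since this record does not specify the paper's exact feature layout, base classifier, or weight-learning formulation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical column layout: 62 channels x 5 bands of per-band features,
# stored as one contiguous block of 62 columns per frequency band.
BANDS = {
    "delta": slice(0, 62),
    "theta": slice(62, 124),
    "alpha": slice(124, 186),
    "beta":  slice(186, 248),
    "gamma": slice(248, 310),
}

# Illustrative scales: several local scales plus the global scale (all bands).
SCALES = [
    ["delta", "theta"],
    ["alpha", "beta"],
    ["gamma"],
    ["delta", "theta", "alpha", "beta", "gamma"],
]

def scale_features(X, scale):
    """Concatenate the feature blocks of the bands that belong to one scale."""
    return np.hstack([X[:, BANDS[band]] for band in scale])

def fit_msfbel(X_train, y_train, X_val, y_val):
    """Train one base classifier per scale; weight scales by validation accuracy.

    The logistic-regression base learner and the accuracy-proportional weights
    are stand-ins for the paper's base classifier and adaptive weight learning.
    """
    models, weights = [], []
    for scale in SCALES:
        clf = LogisticRegression(max_iter=1000)
        clf.fit(scale_features(X_train, scale), y_train)
        models.append(clf)
        weights.append(clf.score(scale_features(X_val, scale), y_val))
    weights = np.asarray(weights)
    weights = weights / weights.sum()  # larger weight for better-performing scales
    return models, weights

def predict_msfbel(models, weights, X):
    """Fuse the per-scale class probabilities with the learned scale weights."""
    fused = sum(w * m.predict_proba(scale_features(X, s))
                for m, w, s in zip(models, weights, SCALES))
    # Assumes integer class labels 0..K-1 so argmax maps back to labels.
    return np.argmax(fused, axis=1)
```

In this sketch, fit_msfbel would be run once per training/validation split (for example, within each SEED IV session or each DEAP cross-validation fold), and predict_msfbel fuses the weighted per-scale class probabilities at test time.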