Multi-Feature Input Deep Forest for EEG-Based Emotion Recognition
Main authors: Fang, Yinfeng; Yang, Haiyang; Zhang, Xuguang; Liu, Han; Tao, Bo
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Subjects: Neuroscience
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7829220/ https://www.ncbi.nlm.nih.gov/pubmed/33505263 http://dx.doi.org/10.3389/fnbot.2020.617531
author | Fang, Yinfeng; Yang, Haiyang; Zhang, Xuguang; Liu, Han; Tao, Bo
author_sort | Fang, Yinfeng |
collection | PubMed |
description | Due to the rapid development of human–computer interaction, affective computing has attracted increasing attention in recent years. In emotion recognition, electroencephalogram (EEG) signals are easier to record than most other physiological signals and are difficult to camouflage. Because of the high-dimensional nature of EEG data and the diversity of human emotions, it is difficult to extract effective EEG features and recognize emotion patterns. This paper proposes a multi-feature deep forest (MFDF) model to identify human emotions. The EEG signals are first divided into several frequency bands, and then the power spectral density (PSD) and differential entropy (DE) are extracted as features from each frequency band and from the original signal. A five-class emotion model labels five emotions: neutral, angry, sad, happy, and pleasant. With either the original features or dimension-reduced features as input, a deep forest is constructed to classify the five emotions. The experiments are conducted on a public dataset for emotion analysis using physiological signals (DEAP). The experimental results are compared with traditional classifiers, including k-nearest neighbors (KNN), random forest (RF), and support vector machine (SVM). The MFDF achieves an average recognition accuracy of 71.05%, which is 3.40%, 8.54%, and 19.53% higher than RF, KNN, and SVM, respectively. In contrast, the accuracies with dimension-reduced features and with the raw EEG signal as input are only 51.30% and 26.71%, respectively. The results of this study show that the method can contribute effectively to EEG-based emotion classification tasks. |
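The feature-extraction step described in the abstract (band-wise PSD and DE) can be sketched as follows. This is a minimal illustration, not the authors' code: the band boundaries, filter order, and Welch settings are assumptions, and DE is computed with the Gaussian closed form 0.5·log(2πe·σ²) that is commonly used in EEG work.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

# Hypothetical band layout; the paper divides EEG into several frequency bands.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_features(signal, fs=128.0):
    """Extract PSD and DE features from each band and from the raw signal.

    PSD: mean Welch power spectral density within the band.
    DE:  differential entropy under a Gaussian assumption,
         0.5 * log(2 * pi * e * var(x)).
    """
    feats = []
    for lo, hi in BANDS.values():
        # Band-pass filter the signal to the current band (zero-phase).
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        x = filtfilt(b, a, signal)
        # Mean PSD within the band via Welch's method.
        f, pxx = welch(x, fs=fs, nperseg=min(256, len(x)))
        mask = (f >= lo) & (f <= hi)
        feats.append(pxx[mask].mean())                          # PSD feature
        feats.append(0.5 * np.log(2 * np.pi * np.e * x.var()))  # DE feature
    # The abstract also takes features from the original (unfiltered) signal.
    f, pxx = welch(signal, fs=fs, nperseg=min(256, len(signal)))
    feats.append(pxx.mean())
    feats.append(0.5 * np.log(2 * np.pi * np.e * signal.var()))
    return np.asarray(feats)
```

Per channel this yields two features (PSD, DE) for each band plus two for the raw signal; concatenating over channels gives the feature vector fed to the deep forest or to the baseline classifiers.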
format | Online Article Text |
id | pubmed-7829220 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-7829220 2021-01-26 Front Neurorobot Neuroscience Frontiers Media S.A. 2021-01-11 /pmc/articles/PMC7829220/ /pubmed/33505263 http://dx.doi.org/10.3389/fnbot.2020.617531 Text en Copyright © 2021 Fang, Yang, Zhang, Liu and Tao. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
title | Multi-Feature Input Deep Forest for EEG-Based Emotion Recognition |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7829220/ https://www.ncbi.nlm.nih.gov/pubmed/33505263 http://dx.doi.org/10.3389/fnbot.2020.617531 |