EEG Emotion Classification Network Based on Attention Fusion of Multi-Channel Band Features

Understanding learners’ emotions can help optimize instruction and further conduct effective learning interventions. Most existing studies on student emotion recognition are based on multiple manifestations of external behavior and do not make full use of physiological signals. In this context, on the one hand, a learning emotion EEG dataset (LE-EEG) is constructed, which captures physiological signals reflecting the emotions of boredom, neutrality, and engagement during learning; on the other hand, an EEG emotion classification network based on attention fusion (ECN-AF) is proposed. Specifically, on the basis of key frequency band and channel selection, multi-channel band features are first extracted (using a multi-channel backbone network) and then fused (using attention units). To verify performance, the proposed model is tested on the open-access SEED dataset (N = 15) and the self-collected LE-EEG dataset (N = 45). The experimental results using five-fold cross validation show the following: (i) on the SEED dataset, the proposed model achieves the highest accuracy of 96.45%, a slight increase of 1.37% over the baseline models; and (ii) on the LE-EEG dataset, it achieves the highest accuracy of 95.87%, an increase of 21.49% over the baseline models.
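
The abstract describes ECN-AF only at a high level: per-band features are extracted by a multi-channel backbone network and then fused by attention units before classification. The sketch below illustrates that attention-fusion idea in PyTorch; the 1D-CNN backbone, 64-dimensional features, five frequency bands, 62 channels, and three emotion classes are illustrative assumptions, not the authors' actual architecture.

```python
# Minimal sketch of attention fusion over per-band EEG features, loosely
# following the ECN-AF description in the abstract. All layer sizes and the
# band/channel/class counts are assumptions for illustration only.
import torch
import torch.nn as nn


class BandBackbone(nn.Module):
    """Per-band feature extractor over (channels, time) segments (assumed 1D CNN)."""

    def __init__(self, n_channels: int, feat_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over time -> one vector per segment
            nn.Flatten(),
            nn.Linear(32, feat_dim),
        )

    def forward(self, x):  # x: (batch, channels, time)
        return self.net(x)


class AttentionFusion(nn.Module):
    """Weight per-band feature vectors with learned attention and sum them."""

    def __init__(self, feat_dim: int):
        super().__init__()
        self.score = nn.Linear(feat_dim, 1)

    def forward(self, band_feats):  # band_feats: (batch, n_bands, feat_dim)
        weights = torch.softmax(self.score(band_feats), dim=1)  # (batch, n_bands, 1)
        return (weights * band_feats).sum(dim=1)  # (batch, feat_dim)


class ECNAFSketch(nn.Module):
    """One backbone per frequency band + attention fusion + classifier head."""

    def __init__(self, n_bands: int = 5, n_channels: int = 62, n_classes: int = 3):
        super().__init__()
        self.backbones = nn.ModuleList(BandBackbone(n_channels) for _ in range(n_bands))
        self.fusion = AttentionFusion(64)
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):  # x: (batch, n_bands, channels, time), one slice per band
        feats = torch.stack([bb(x[:, i]) for i, bb in enumerate(self.backbones)], dim=1)
        return self.head(self.fusion(feats))


# Smoke test on random data: 8 segments, 5 bands, 62 channels, 200 time samples.
logits = ECNAFSketch()(torch.randn(8, 5, 62, 200))
print(logits.shape)  # torch.Size([8, 3])
```

For the evaluation protocol, the five-fold cross validation reported in the abstract would correspond to splitting the segments with, for example, sklearn.model_selection.KFold(n_splits=5) and averaging the per-fold accuracies.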


Bibliographic Details
Main authors: Zhu, Xiaoliang; Rong, Wenting; Zhao, Liang; He, Zili; Yang, Qiaolai; Sun, Junyi; Liu, Gendong
Format: Online Article Text
Language: English
Published: MDPI, 13 July 2022
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9318779/
https://www.ncbi.nlm.nih.gov/pubmed/35890933
http://dx.doi.org/10.3390/s22145252
collection PubMed
id pubmed-9318779
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
license © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
topic Article