
EEG Emotion Recognition by Fusion of Multi-Scale Features

Electroencephalogram (EEG) signals exhibit low amplitude, complex background noise, randomness, and significant inter-individual differences, which pose challenges in extracting sufficient features and can lead to information loss during the mapping process from low-dimensional feature matrices to high-dimensional ones in emotion recognition algorithms. In this paper, we propose a Multi-scale Deformable Convolutional Interacting Attention Network based on Residual Network (MDCNAResnet) for EEG-based emotion recognition. Firstly, we extract differential entropy features from different channels of EEG signals and construct a three-dimensional feature matrix based on the relative positions of electrode channels. Secondly, we utilize deformable convolution (DCN) to extract high-level abstract features by replacing standard convolution with deformable convolution, enhancing the modeling capability of the convolutional neural network for irregular targets. Then, we develop the Bottom-Up Feature Pyramid Network (BU-FPN) to extract multi-scale data features, enabling complementary information from different levels in the neural network, while optimizing the feature extraction process using Efficient Channel Attention (ECANet). Finally, we combine the MDCNAResnet with a Bidirectional Gated Recurrent Unit (BiGRU) to further capture the contextual semantic information of EEG signals. Experimental results on the DEAP dataset demonstrate the effectiveness of our approach, achieving accuracies of 98.63% and 98.89% for Valence and Arousal dimensions, respectively.
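The abstract describes extracting band-wise differential entropy (DE) per electrode and arranging the values by the electrodes' relative scalp positions. Below is a minimal sketch of that feature step, not the authors' code: it assumes each band-filtered channel is approximately Gaussian, so DE reduces to 0.5*ln(2*pi*e*variance); the band edges, 128 Hz sampling rate, and helper names are illustrative assumptions. Mapping each (band, channel) value onto a 2D grid of electrode positions would then yield the three-dimensional input matrix mentioned above.

# Minimal sketch (not the authors' code) of band-wise DE feature extraction.
import numpy as np
from scipy.signal import butter, filtfilt

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def bandpass(x, lo, hi, fs, order=4):
    # Zero-phase Butterworth band-pass filter for one channel.
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def differential_entropy(x):
    # DE of a (near-)Gaussian signal: 0.5 * ln(2 * pi * e * var).
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def de_features(segment, fs=128):
    # segment: (n_channels, n_samples) -> (n_bands, n_channels) DE matrix.
    feats = np.empty((len(BANDS), segment.shape[0]))
    for b, (lo, hi) in enumerate(BANDS.values()):
        for c in range(segment.shape[0]):
            feats[b, c] = differential_entropy(bandpass(segment[c], lo, hi, fs))
    return feats

# Example: one 1-second, 32-channel segment of synthetic data -> (4, 32).
rng = np.random.default_rng(0)
print(de_features(rng.standard_normal((32, 128))).shape)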


Bibliographic Details
Main Authors: Du, Xiuli, Meng, Yifei, Qiu, Shaoming, Lv, Yana, Liu, Qingli
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10526490/
https://www.ncbi.nlm.nih.gov/pubmed/37759894
http://dx.doi.org/10.3390/brainsci13091293
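
As a companion to the abstract, the sketch below shows in PyTorch one way the named components could be wired together: a deformable convolution via torchvision.ops.DeformConv2d, ECA-style channel attention, and a BiGRU over a sequence of per-segment DE maps. This is a hedged sketch, not the authors' MDCNAResnet; the BU-FPN multi-scale fusion is omitted for brevity, and the layer sizes, 9x9 electrode grid, and binary high/low output are illustrative assumptions.

# Minimal sketch of a deformable-conv + ECA + BiGRU pipeline (illustrative only).
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d

class DeformBlock(nn.Module):
    # 3x3 deformable convolution; a plain conv predicts the sampling offsets.
    def __init__(self, c_in, c_out):
        super().__init__()
        self.offset = nn.Conv2d(c_in, 2 * 3 * 3, kernel_size=3, padding=1)
        self.deform = DeformConv2d(c_in, c_out, kernel_size=3, padding=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.deform(x, self.offset(x)))

class ECA(nn.Module):
    # Efficient Channel Attention: 1D conv over globally pooled channel statistics.
    def __init__(self, k=3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)

    def forward(self, x):
        w = x.mean(dim=(2, 3))                        # (B, C) global average pool
        w = torch.sigmoid(self.conv(w.unsqueeze(1)))  # (B, 1, C) channel weights
        return x * w.transpose(1, 2).unsqueeze(-1)    # reweight feature maps

class EmotionNet(nn.Module):
    # DE maps (B, T, bands, H, W) -> per-clip class logits (e.g., high/low valence).
    def __init__(self, bands=4, hidden=64, n_classes=2):
        super().__init__()
        self.cnn = nn.Sequential(DeformBlock(bands, 32), ECA(),
                                 DeformBlock(32, 64), ECA(),
                                 nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.gru = nn.GRU(64, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):
        b, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).view(b, t, -1)  # CNN applied per time step
        out, _ = self.gru(feats)                          # BiGRU over the segment sequence
        return self.fc(out[:, -1])                        # classify from the last step

# Example: batch of 2 clips, 5 one-second segments, 4 bands on a 9x9 grid.
print(EmotionNet()(torch.randn(2, 5, 4, 9, 9)).shape)  # torch.Size([2, 2])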
_version_ 1785111034581745664
author Du, Xiuli
Meng, Yifei
Qiu, Shaoming
Lv, Yana
Liu, Qingli
author_facet Du, Xiuli
Meng, Yifei
Qiu, Shaoming
Lv, Yana
Liu, Qingli
author_sort Du, Xiuli
collection PubMed
description Electroencephalogram (EEG) signals exhibit low amplitude, complex background noise, randomness, and significant inter-individual differences, which pose challenges in extracting sufficient features and can lead to information loss during the mapping process from low-dimensional feature matrices to high-dimensional ones in emotion recognition algorithms. In this paper, we propose a Multi-scale Deformable Convolutional Interacting Attention Network based on Residual Network (MDCNAResnet) for EEG-based emotion recognition. Firstly, we extract differential entropy features from different channels of EEG signals and construct a three-dimensional feature matrix based on the relative positions of electrode channels. Secondly, we utilize deformable convolution (DCN) to extract high-level abstract features by replacing standard convolution with deformable convolution, enhancing the modeling capability of the convolutional neural network for irregular targets. Then, we develop the Bottom-Up Feature Pyramid Network (BU-FPN) to extract multi-scale data features, enabling complementary information from different levels in the neural network, while optimizing the feature extraction process using Efficient Channel Attention (ECANet). Finally, we combine the MDCNAResnet with a Bidirectional Gated Recurrent Unit (BiGRU) to further capture the contextual semantic information of EEG signals. Experimental results on the DEAP dataset demonstrate the effectiveness of our approach, achieving accuracies of 98.63% and 98.89% for Valence and Arousal dimensions, respectively.
format Online
Article
Text
id pubmed-10526490
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10526490 2023-09-28 EEG Emotion Recognition by Fusion of Multi-Scale Features Du, Xiuli Meng, Yifei Qiu, Shaoming Lv, Yana Liu, Qingli Brain Sci Article Electroencephalogram (EEG) signals exhibit low amplitude, complex background noise, randomness, and significant inter-individual differences, which pose challenges in extracting sufficient features and can lead to information loss during the mapping process from low-dimensional feature matrices to high-dimensional ones in emotion recognition algorithms. In this paper, we propose a Multi-scale Deformable Convolutional Interacting Attention Network based on Residual Network (MDCNAResnet) for EEG-based emotion recognition. Firstly, we extract differential entropy features from different channels of EEG signals and construct a three-dimensional feature matrix based on the relative positions of electrode channels. Secondly, we utilize deformable convolution (DCN) to extract high-level abstract features by replacing standard convolution with deformable convolution, enhancing the modeling capability of the convolutional neural network for irregular targets. Then, we develop the Bottom-Up Feature Pyramid Network (BU-FPN) to extract multi-scale data features, enabling complementary information from different levels in the neural network, while optimizing the feature extraction process using Efficient Channel Attention (ECANet). Finally, we combine the MDCNAResnet with a Bidirectional Gated Recurrent Unit (BiGRU) to further capture the contextual semantic information of EEG signals. Experimental results on the DEAP dataset demonstrate the effectiveness of our approach, achieving accuracies of 98.63% and 98.89% for Valence and Arousal dimensions, respectively. MDPI 2023-09-07 /pmc/articles/PMC10526490/ /pubmed/37759894 http://dx.doi.org/10.3390/brainsci13091293 Text en © 2023 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Du, Xiuli
Meng, Yifei
Qiu, Shaoming
Lv, Yana
Liu, Qingli
EEG Emotion Recognition by Fusion of Multi-Scale Features
title EEG Emotion Recognition by Fusion of Multi-Scale Features
title_full EEG Emotion Recognition by Fusion of Multi-Scale Features
title_fullStr EEG Emotion Recognition by Fusion of Multi-Scale Features
title_full_unstemmed EEG Emotion Recognition by Fusion of Multi-Scale Features
title_short EEG Emotion Recognition by Fusion of Multi-Scale Features
title_sort eeg emotion recognition by fusion of multi-scale features
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10526490/
https://www.ncbi.nlm.nih.gov/pubmed/37759894
http://dx.doi.org/10.3390/brainsci13091293
work_keys_str_mv AT duxiuli eegemotionrecognitionbyfusionofmultiscalefeatures
AT mengyifei eegemotionrecognitionbyfusionofmultiscalefeatures
AT qiushaoming eegemotionrecognitionbyfusionofmultiscalefeatures
AT lvyana eegemotionrecognitionbyfusionofmultiscalefeatures
AT liuqingli eegemotionrecognitionbyfusionofmultiscalefeatures