
CR-GCN: Channel-Relationships-Based Graph Convolutional Network for EEG Emotion Recognition

Electroencephalography (EEG) is recorded by electrodes placed over different areas of the brain and is commonly used to measure neuronal activity. EEG-based methods have recently been widely used for emotion recognition. However, most current methods for EEG-based emotion recognition do not fully exploit the relationships among EEG channels, which limits the precision of emotion recognition. To address this issue, we propose a novel method for EEG-based emotion recognition called CR-GCN: Channel-Relationships-Based Graph Convolutional Network. Specifically, the topological structure of EEG channels is distance-based and tends to capture local relationships, whereas brain functional connectivity tends to capture global relationships among EEG channels. We therefore construct EEG channel relationships using an adjacency matrix in a graph convolutional network, where the adjacency matrix captures both local and global relationships among different EEG channels. Extensive experiments demonstrate that CR-GCN significantly outperforms state-of-the-art methods. In subject-dependent experiments, average classification accuracies of 94.69% and 93.95% are achieved for valence and arousal, respectively. In subject-independent experiments, average classification accuracies of 94.78% and 93.46% are obtained for valence and arousal, respectively.

Bibliographic Details
Main Authors: Jia, Jingjing; Zhang, Bofeng; Lv, Hehe; Xu, Zhikang; Hu, Shengxiang; Li, Haiyan
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9394289/
https://www.ncbi.nlm.nih.gov/pubmed/35892427
http://dx.doi.org/10.3390/brainsci12080987
_version_ 1784771456042795008
author Jia, Jingjing
Zhang, Bofeng
Lv, Hehe
Xu, Zhikang
Hu, Shengxiang
Li, Haiyan
author_sort Jia, Jingjing
collection PubMed
description Electroencephalography (EEG) is recorded by electrodes placed over different areas of the brain and is commonly used to measure neuronal activity. EEG-based methods have recently been widely used for emotion recognition. However, most current methods for EEG-based emotion recognition do not fully exploit the relationships among EEG channels, which limits the precision of emotion recognition. To address this issue, we propose a novel method for EEG-based emotion recognition called CR-GCN: Channel-Relationships-Based Graph Convolutional Network. Specifically, the topological structure of EEG channels is distance-based and tends to capture local relationships, whereas brain functional connectivity tends to capture global relationships among EEG channels. We therefore construct EEG channel relationships using an adjacency matrix in a graph convolutional network, where the adjacency matrix captures both local and global relationships among different EEG channels. Extensive experiments demonstrate that CR-GCN significantly outperforms state-of-the-art methods. In subject-dependent experiments, average classification accuracies of 94.69% and 93.95% are achieved for valence and arousal, respectively. In subject-independent experiments, average classification accuracies of 94.78% and 93.46% are obtained for valence and arousal, respectively.
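The description above is the one technical passage in this record: an adjacency matrix that blends a distance-based (local) channel topology with functional-connectivity-based (global) channel relationships, which then drives a graph convolution. The snippet below is a minimal, illustrative sketch of that idea under stated assumptions, not the authors' implementation; the electrode coordinates, the correlation-based connectivity measure, the mixing weight `alpha`, and all function names are introduced here purely for illustration.

```python
import numpy as np

def local_adjacency(coords, threshold=0.3):
    """Distance-based (local) adjacency: connect channels whose
    Euclidean distance on the scalp falls below a threshold."""
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A_local = (dist < threshold).astype(float)
    np.fill_diagonal(A_local, 0.0)
    return A_local

def global_adjacency(eeg):
    """Functional-connectivity (global) adjacency: absolute Pearson
    correlation between channel signals; eeg has shape (channels, samples)."""
    A_global = np.abs(np.corrcoef(eeg))
    np.fill_diagonal(A_global, 0.0)
    return A_global

def combined_adjacency(coords, eeg, alpha=0.5):
    """Blend local and global channel relationships; alpha is an
    assumed mixing weight, not a value taken from the paper."""
    return alpha * local_adjacency(coords) + (1.0 - alpha) * global_adjacency(eeg)

def gcn_layer(A, X, W):
    """One graph-convolution step with symmetric normalization:
    H = ReLU(D^-1/2 (A + I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)

# Toy usage with 32 hypothetical EEG channels and random data.
rng = np.random.default_rng(0)
coords = rng.random((32, 3))          # hypothetical 3-D electrode positions
eeg = rng.standard_normal((32, 512))  # hypothetical raw channel signals
X = rng.standard_normal((32, 5))      # per-channel features (e.g., band power)
W = rng.standard_normal((5, 8))       # learnable layer weights

A = combined_adjacency(coords, eeg, alpha=0.5)
H = gcn_layer(A, X, W)                # (32, 8) channel embeddings
```

In this sketch `alpha` simply interpolates between the two adjacency matrices; the paper itself may combine the local and global relationships differently, and the downstream classifier for valence and arousal is omitted.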
format Online
Article
Text
id pubmed-9394289
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9394289 2022-08-23 CR-GCN: Channel-Relationships-Based Graph Convolutional Network for EEG Emotion Recognition. Jia, Jingjing; Zhang, Bofeng; Lv, Hehe; Xu, Zhikang; Hu, Shengxiang; Li, Haiyan. Brain Sci. Article. MDPI 2022-07-26 /pmc/articles/PMC9394289/ /pubmed/35892427 http://dx.doi.org/10.3390/brainsci12080987 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title CR-GCN: Channel-Relationships-Based Graph Convolutional Network for EEG Emotion Recognition
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9394289/
https://www.ncbi.nlm.nih.gov/pubmed/35892427
http://dx.doi.org/10.3390/brainsci12080987