
STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition

Electroencephalography (EEG) is a crucial and widely used technique in neuroscience research. In this paper, we introduce a novel graph neural network, the spatial-temporal graph attention network with a transformer encoder (STGATE), to learn graph representations of emotional EEG signals and improve emotion recognition performance. In STGATE, a transformer encoder captures time-frequency features, which are then fed into a spatial-temporal graph attention network for emotion classification. Using a dynamic adjacency matrix, STGATE adaptively learns the intrinsic connections between EEG channels. To evaluate cross-subject emotion recognition performance, leave-one-subject-out experiments are carried out on three public emotion recognition datasets: SEED, SEED-IV, and DREAMER. STGATE achieved state-of-the-art EEG-based emotion recognition accuracies of 90.37% on SEED, 76.43% on SEED-IV, and 76.35% on DREAMER. These experiments demonstrate the effectiveness of STGATE for cross-subject EEG emotion recognition and its potential for graph-based neuroscience research.
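
The abstract above describes an architecture in which a transformer encoder extracts time-frequency features and a spatial-temporal graph attention module with a dynamic (learnable) adjacency matrix aggregates them across EEG channels. The PyTorch sketch below is not the authors' implementation: the module names, the 62-channel/5-band input shape (typical of SEED differential-entropy features), and all layer sizes are illustrative assumptions showing one way such a pipeline could be wired.

```python
# Minimal STGATE-style sketch (NOT the authors' code): a transformer encoder
# produces per-channel features, and a graph-attention layer with a learnable
# ("dynamic") adjacency bias aggregates them across EEG channels before
# classification. All names, dimensions, and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """Single-head graph attention over EEG channels with a learnable adjacency bias."""

    def __init__(self, in_dim: int, out_dim: int, n_channels: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)
        self.attn = nn.Linear(2 * out_dim, 1)
        # Learnable adjacency logits: lets the model discover channel connections.
        self.adj_logits = nn.Parameter(torch.zeros(n_channels, n_channels))

    def forward(self, x):                           # x: (batch, channels, in_dim)
        h = self.proj(x)                            # (batch, channels, out_dim)
        n = h.size(1)
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)   # (batch, n, n, out_dim)
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)   # (batch, n, n, out_dim)
        e = F.leaky_relu(self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        e = e + self.adj_logits                     # bias scores with learned adjacency
        alpha = torch.softmax(e, dim=-1)            # attention over neighboring channels
        return F.elu(torch.matmul(alpha, h))        # (batch, channels, out_dim)


class STGATELikeModel(nn.Module):
    """Hypothetical STGATE-style classifier: transformer encoder + graph attention."""

    def __init__(self, n_channels=62, n_bands=5, d_model=64, n_classes=3):
        super().__init__()
        self.embed = nn.Linear(n_bands, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.gat = GraphAttentionLayer(d_model, d_model, n_channels)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):                      # x: (batch, channels, bands)
        h = self.embed(x)                      # per-channel feature embedding
        h = self.encoder(h)                    # self-attention across channel tokens
        h = self.gat(h)                        # spatial aggregation via graph attention
        return self.classifier(h.mean(dim=1))  # pool channels, predict emotion class


# Quick shape check with random data (62 channels x 5 frequency bands, as in SEED).
logits = STGATELikeModel()(torch.randn(8, 62, 5))
print(logits.shape)  # torch.Size([8, 3])
```

Biasing the channel-to-channel attention scores with a learnable adjacency matrix is one simple way to let a model infer inter-channel connections from data rather than fixing them from electrode geometry, which is the general idea the abstract attributes to the dynamic adjacency matrix.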

Bibliographic Details
Main Authors: Li, Jingcong; Pan, Weijian; Huang, Haiyun; Pan, Jiahui; Wang, Fei
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects: Human Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10133470/
https://www.ncbi.nlm.nih.gov/pubmed/37125349
http://dx.doi.org/10.3389/fnhum.2023.1169949
_version_ 1785031572266680320
author Li, Jingcong
Pan, Weijian
Huang, Haiyun
Pan, Jiahui
Wang, Fei
author_facet Li, Jingcong
Pan, Weijian
Huang, Haiyun
Pan, Jiahui
Wang, Fei
author_sort Li, Jingcong
collection PubMed
description Electroencephalography (EEG) is a crucial and widely used technique in neuroscience research. In this paper, we introduce a novel graph neural network, the spatial-temporal graph attention network with a transformer encoder (STGATE), to learn graph representations of emotional EEG signals and improve emotion recognition performance. In STGATE, a transformer encoder captures time-frequency features, which are then fed into a spatial-temporal graph attention network for emotion classification. Using a dynamic adjacency matrix, STGATE adaptively learns the intrinsic connections between EEG channels. To evaluate cross-subject emotion recognition performance, leave-one-subject-out experiments are carried out on three public emotion recognition datasets: SEED, SEED-IV, and DREAMER. STGATE achieved state-of-the-art EEG-based emotion recognition accuracies of 90.37% on SEED, 76.43% on SEED-IV, and 76.35% on DREAMER. These experiments demonstrate the effectiveness of STGATE for cross-subject EEG emotion recognition and its potential for graph-based neuroscience research.
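
The description notes that cross-subject performance was evaluated with leave-one-subject-out (LOSO) experiments. As a minimal sketch of that protocol only (none of this is the authors' code), the example below holds out each subject in turn, trains on the remaining subjects, and averages test accuracy; load_subject_data and the logistic-regression classifier are hypothetical placeholders.

```python
# Minimal sketch of the leave-one-subject-out (LOSO) protocol. Nothing here is
# from the paper: `load_subject_data`, the feature shapes, and the classifier
# are placeholders used only to illustrate the splitting logic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score


def load_subject_data(subject_id):
    """Placeholder: return (features, labels) for one subject.
    A real experiment would load SEED/SEED-IV/DREAMER features instead."""
    rng = np.random.default_rng(subject_id)
    X = rng.normal(size=(200, 310))      # e.g. 62 channels x 5 bands, flattened
    y = rng.integers(0, 3, size=200)     # 3 emotion classes, as in SEED
    return X, y


def leave_one_subject_out(subject_ids):
    accuracies = []
    for test_subject in subject_ids:
        # Train on every subject except the held-out one.
        train_ids = [s for s in subject_ids if s != test_subject]
        X_train = np.concatenate([load_subject_data(s)[0] for s in train_ids])
        y_train = np.concatenate([load_subject_data(s)[1] for s in train_ids])
        X_test, y_test = load_subject_data(test_subject)

        clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        accuracies.append(accuracy_score(y_test, clf.predict(X_test)))
    return float(np.mean(accuracies))


print(f"Mean LOSO accuracy: {leave_one_subject_out(range(15)):.3f}")
```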
format Online
Article
Text
id pubmed-10133470
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-10133470 2023-04-28 STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition Li, Jingcong Pan, Weijian Huang, Haiyun Pan, Jiahui Wang, Fei Front Hum Neurosci Human Neuroscience Electroencephalography (EEG) is a crucial and widely used technique in neuroscience research. In this paper, we introduce a novel graph neural network, the spatial-temporal graph attention network with a transformer encoder (STGATE), to learn graph representations of emotional EEG signals and improve emotion recognition performance. In STGATE, a transformer encoder captures time-frequency features, which are then fed into a spatial-temporal graph attention network for emotion classification. Using a dynamic adjacency matrix, STGATE adaptively learns the intrinsic connections between EEG channels. To evaluate cross-subject emotion recognition performance, leave-one-subject-out experiments are carried out on three public emotion recognition datasets: SEED, SEED-IV, and DREAMER. STGATE achieved state-of-the-art EEG-based emotion recognition accuracies of 90.37% on SEED, 76.43% on SEED-IV, and 76.35% on DREAMER. These experiments demonstrate the effectiveness of STGATE for cross-subject EEG emotion recognition and its potential for graph-based neuroscience research. Frontiers Media S.A. 2023-04-13 /pmc/articles/PMC10133470/ /pubmed/37125349 http://dx.doi.org/10.3389/fnhum.2023.1169949 Text en Copyright © 2023 Li, Pan, Huang, Pan and Wang. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Human Neuroscience
Li, Jingcong
Pan, Weijian
Huang, Haiyun
Pan, Jiahui
Wang, Fei
STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition
title STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition
title_full STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition
title_fullStr STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition
title_full_unstemmed STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition
title_short STGATE: Spatial-temporal graph attention network with a transformer encoder for EEG-based emotion recognition
title_sort stgate: spatial-temporal graph attention network with a transformer encoder for eeg-based emotion recognition
topic Human Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10133470/
https://www.ncbi.nlm.nih.gov/pubmed/37125349
http://dx.doi.org/10.3389/fnhum.2023.1169949
work_keys_str_mv AT lijingcong stgatespatialtemporalgraphattentionnetworkwithatransformerencoderforeegbasedemotionrecognition
AT panweijian stgatespatialtemporalgraphattentionnetworkwithatransformerencoderforeegbasedemotionrecognition
AT huanghaiyun stgatespatialtemporalgraphattentionnetworkwithatransformerencoderforeegbasedemotionrecognition
AT panjiahui stgatespatialtemporalgraphattentionnetworkwithatransformerencoderforeegbasedemotionrecognition
AT wangfei stgatespatialtemporalgraphattentionnetworkwithatransformerencoderforeegbasedemotionrecognition