Interpretable Cross-Subject EEG-Based Emotion Recognition Using Channel-Wise Features (†)

Bibliographic Details
Main Authors: Jin, Longbin; Kim, Eun Yi
Format: Online Article, Text
Language: English
Published: MDPI, 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7727848/
https://www.ncbi.nlm.nih.gov/pubmed/33255374
http://dx.doi.org/10.3390/s20236719
Collection: PubMed
Full Description: Electroencephalogram (EEG)-based emotion recognition is receiving significant attention in research on brain-computer interfaces (BCI) and health care. To recognize cross-subject emotion from EEG data accurately, a technique is needed that can find an effective representation robust to the subject-specific variability associated with EEG data collection. In this paper, a new method to predict cross-subject emotion using time-series analysis and spatial correlation is proposed. To represent the spatial connectivity between brain regions, a channel-wise feature is proposed, which can effectively handle the correlation between all channels. The channel-wise feature is defined by a symmetric matrix, the elements of which are the Pearson correlation coefficients between each pair of channels, and which can complementarily handle subject-specific variability. The channel-wise features are then fed to a two-layer stacked long short-term memory (LSTM) network, which extracts temporal features and learns an emotion model. Extensive experiments on two publicly available datasets, the Dataset for Emotion Analysis using Physiological Signals (DEAP) and the SJTU (Shanghai Jiao Tong University) Emotion EEG Dataset (SEED), demonstrate the effectiveness of the combined use of channel-wise features and LSTM. Experimental results achieve state-of-the-art classification rates of 98.93% and 99.10% for the two-class classification of valence and arousal in DEAP, respectively, and an accuracy of 99.63% for three-class classification in SEED.
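
The description specifies the pipeline concretely enough to sketch: per time window, build a symmetric matrix of Pearson correlation coefficients over every channel pair, then feed the sequence of matrices to a two-layer stacked LSTM. The following minimal Python sketch illustrates that pipeline; it is not the authors' code, and the window length, hidden size, and DEAP-style dimensions (32 channels, 128 Hz, 60 s trials) are assumptions made for illustration.

import numpy as np
import torch
import torch.nn as nn

def channel_wise_feature(window: np.ndarray) -> np.ndarray:
    # Symmetric (channels x channels) Pearson correlation matrix
    # for one window of shape (channels, samples).
    return np.corrcoef(window)

class EmotionLSTM(nn.Module):
    # Two-layer stacked LSTM over flattened correlation matrices;
    # sizes are illustrative guesses, not taken from the paper.
    def __init__(self, n_channels: int = 32, hidden: int = 128, n_classes: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_channels * n_channels,
                            hidden_size=hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_channels * n_channels)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # classify from the final time step

# Example: one DEAP-style trial (32 channels, 128 Hz, 60 s) in 1 s windows.
eeg = np.random.randn(32, 60 * 128)
feats = np.stack([channel_wise_feature(eeg[:, t * 128:(t + 1) * 128]).ravel()
                  for t in range(60)])
logits = EmotionLSTM()(torch.tensor(feats[None], dtype=torch.float32))
print(logits.shape)  # torch.Size([1, 2]) -> two-class valence or arousal scores

Because the correlation matrix is symmetric, its upper triangle alone would suffice as input; the sketch keeps the full flattened matrix only for simplicity.
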
Record ID: pubmed-7727848
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Sensors (Basel), Article
Published Online: 2020-11-24 (PMC release: 2020-12-11)
Rights: © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).