
Multi-source joint domain adaptation for cross-subject and cross-session emotion recognition from electroencephalography

Bibliographic Details
Main Authors: Liang, Shengjin, Su, Lei, Fu, Yunfa, Wu, Liping
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9520599/
https://www.ncbi.nlm.nih.gov/pubmed/36188181
http://dx.doi.org/10.3389/fnhum.2022.921346
collection PubMed
description As an important component in the development of affective brain–computer interfaces, emotion recognition based on electroencephalography (EEG) faces a difficult challenge: the distribution of EEG data changes across subjects and across time periods. Domain adaptation methods can effectively alleviate this generalization problem in EEG emotion recognition models. However, most of them treat multiple source domains with significantly different distributions as a single source domain, and adapt only the cross-domain marginal distribution while ignoring the joint distribution difference between domains. To exploit the advantages of multiple source distributions and better match the distributions of the source and target domains, this paper proposes a novel multi-source joint domain adaptation (MSJDA) network. We first map all domains to a shared feature space and then, for each pair of source and target domains, align the joint distribution of the further extracted private representations and the corresponding classification predictions. Extensive cross-subject and cross-session experiments on the benchmark SEED dataset demonstrate the effectiveness of the proposed model, with more significant classification improvements on the more difficult cross-subject emotion recognition task.
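The description outlines the MSJDA architecture: a shared extractor maps every domain into a common feature space; a private extractor per source domain further specializes that representation; and, for each source–target pair, the joint distribution of private features and classification predictions is aligned. The following PyTorch sketch is one hypothetical reading of that description, assuming maximum mean discrepancy (MMD) as the alignment criterion, 310-dimensional differential-entropy inputs (62 SEED channels × 5 frequency bands), and 14 source subjects as in leave-one-subject-out cross-subject evaluation; the module names, the single-kernel MMD, and all hyper-parameters are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def gaussian_mmd(x, y, sigma=1.0):
    # Squared maximum mean discrepancy under one Gaussian kernel; published
    # MMD-based adaptation methods often mix several bandwidths instead.
    def k(a, b):
        return torch.exp(-torch.cdist(a, b).pow(2) / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

class MSJDA(nn.Module):
    # Hypothetical reading of the abstract: one shared extractor, plus one
    # private extractor and one classification head per source domain.
    def __init__(self, in_dim=310, hid=64, n_classes=3, n_sources=14):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                    nn.Linear(128, hid), nn.ReLU())
        self.private = nn.ModuleList(
            [nn.Sequential(nn.Linear(hid, hid), nn.ReLU())
             for _ in range(n_sources)])
        self.heads = nn.ModuleList(
            [nn.Linear(hid, n_classes) for _ in range(n_sources)])

    def forward(self, xs_list, ys_list, xt, lam=1.0):
        # xs_list/ys_list: one labeled batch per source domain; xt: unlabeled
        # target batch. Returns the combined training loss.
        zt = self.shared(xt)
        loss_cls = loss_align = 0.0
        for priv, head, xs, ys in zip(self.private, self.heads,
                                      xs_list, ys_list):
            fs = priv(self.shared(xs))   # source private representation
            ft = priv(zt)                # target through the same branch
            ps, pt = head(fs), head(ft)
            loss_cls = loss_cls + F.cross_entropy(ps, ys)
            # Joint alignment: pair each private representation with its
            # softmax prediction so the MMD compares joint distributions of
            # (feature, prediction), not just feature marginals.
            js = torch.cat([fs, F.softmax(ps, dim=1)], dim=1)
            jt = torch.cat([ft, F.softmax(pt, dim=1)], dim=1)
            loss_align = loss_align + gaussian_mmd(js, jt)
        n = len(xs_list)
        return loss_cls / n + lam * loss_align / n

At test time, a common convention in multi-source adaptation, assumed here as well since the abstract does not specify it, is to average the per-source heads' softmax outputs on a target sample and take the argmax as the predicted emotion.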
id pubmed-9520599
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal Front Hum Neurosci
publishDate 2022-09-15
license Copyright © 2022 Liang, Su, Fu and Wu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY, https://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.