
MS-MDA: Multisource Marginal Distribution Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition


Bibliographic Details
Main Authors: Chen, Hao, Jin, Ming, Li, Zhunan, Fan, Cunhang, Li, Jinpeng, He, Huiguang
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8688841/
https://www.ncbi.nlm.nih.gov/pubmed/34949983
http://dx.doi.org/10.3389/fnins.2021.778488
_version_ 1784618432087457792
author Chen, Hao
Jin, Ming
Li, Zhunan
Fan, Cunhang
Li, Jinpeng
He, Huiguang
author_facet Chen, Hao
Jin, Ming
Li, Zhunan
Fan, Cunhang
Li, Jinpeng
He, Huiguang
author_sort Chen, Hao
collection PubMed
description As an essential element for the diagnosis and rehabilitation of psychiatric disorders, the electroencephalogram (EEG) based emotion recognition has achieved significant progress due to its high precision and reliability. However, one obstacle to practicality lies in the variability between subjects and sessions. Although several studies have adopted domain adaptation (DA) approaches to tackle this problem, most of them treat multiple EEG data from different subjects and sessions together as a single source domain for transfer, which either fails to satisfy the assumption of domain adaptation that the source has a certain marginal distribution, or increases the difficulty of adaptation. We therefore propose the multi-source marginal distribution adaptation (MS-MDA) for EEG emotion recognition, which takes both domain-invariant and domain-specific features into consideration. First, we assume that different EEG data share the same low-level features, then we construct independent branches for multiple EEG data source domains to adopt one-to-one domain adaptation and extract domain-specific features. Finally, the inference is made by multiple branches. We evaluate our method on SEED and SEED-IV for recognizing three and four emotions, respectively. Experimental results show that the MS-MDA outperforms the comparison methods and state-of-the-art models in cross-session and cross-subject transfer scenarios in our settings. Codes at https://github.com/VoiceBeer/MS-MDA.
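The abstract above describes one-to-one marginal distribution adaptation in each source-domain branch. A common discrepancy measure for aligning marginal distributions is the Maximum Mean Discrepancy (MMD); the minimal NumPy sketch below (RBF kernel, illustrative names, not the authors' implementation) shows the quantity such a branch would minimize between a source-domain batch and a target-domain batch.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """Pairwise RBF (Gaussian) kernel matrix between the rows of a and b."""
    sq = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] - 2.0 * a @ b.T
    return np.exp(-gamma * sq)

def mmd2(x, y, gamma=1.0):
    """Biased estimate of the squared Maximum Mean Discrepancy between samples x and y."""
    return (rbf_kernel(x, x, gamma).mean()
            + rbf_kernel(y, y, gamma).mean()
            - 2.0 * rbf_kernel(x, y, gamma).mean())

# Toy check: a mean-shifted "target" batch yields a larger discrepancy
# than a batch drawn from the same distribution as the source.
rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, (200, 8))      # one source-domain feature batch
same = rng.normal(0.0, 1.0, (200, 8))     # same marginal distribution
shifted = rng.normal(0.5, 1.0, (200, 8))  # shifted marginal distribution
assert mmd2(src, shifted) > mmd2(src, same)
```

In a multi-branch setup like the one described, each source domain would get its own such discrepancy term against the target, added to the per-branch classification loss.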
format Online
Article
Text
id pubmed-8688841
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-86888412021-12-22 MS-MDA: Multisource Marginal Distribution Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition Chen, Hao Jin, Ming Li, Zhunan Fan, Cunhang Li, Jinpeng He, Huiguang Front Neurosci Neuroscience As an essential element for the diagnosis and rehabilitation of psychiatric disorders, the electroencephalogram (EEG) based emotion recognition has achieved significant progress due to its high precision and reliability. However, one obstacle to practicality lies in the variability between subjects and sessions. Although several studies have adopted domain adaptation (DA) approaches to tackle this problem, most of them treat multiple EEG data from different subjects and sessions together as a single source domain for transfer, which either fails to satisfy the assumption of domain adaptation that the source has a certain marginal distribution, or increases the difficulty of adaptation. We therefore propose the multi-source marginal distribution adaptation (MS-MDA) for EEG emotion recognition, which takes both domain-invariant and domain-specific features into consideration. First, we assume that different EEG data share the same low-level features, then we construct independent branches for multiple EEG data source domains to adopt one-to-one domain adaptation and extract domain-specific features. Finally, the inference is made by multiple branches. We evaluate our method on SEED and SEED-IV for recognizing three and four emotions, respectively. Experimental results show that the MS-MDA outperforms the comparison methods and state-of-the-art models in cross-session and cross-subject transfer scenarios in our settings. Codes at https://github.com/VoiceBeer/MS-MDA. Frontiers Media S.A. 2021-12-07 /pmc/articles/PMC8688841/ /pubmed/34949983 http://dx.doi.org/10.3389/fnins.2021.778488 Text en Copyright © 2021 Chen, Jin, Li, Fan, Li and He. 
https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Chen, Hao
Jin, Ming
Li, Zhunan
Fan, Cunhang
Li, Jinpeng
He, Huiguang
MS-MDA: Multisource Marginal Distribution Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition
title MS-MDA: Multisource Marginal Distribution Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition
title_full MS-MDA: Multisource Marginal Distribution Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition
title_fullStr MS-MDA: Multisource Marginal Distribution Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition
title_full_unstemmed MS-MDA: Multisource Marginal Distribution Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition
title_short MS-MDA: Multisource Marginal Distribution Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition
title_sort ms-mda: multisource marginal distribution adaptation for cross-subject and cross-session eeg emotion recognition
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8688841/
https://www.ncbi.nlm.nih.gov/pubmed/34949983
http://dx.doi.org/10.3389/fnins.2021.778488
work_keys_str_mv AT chenhao msmdamultisourcemarginaldistributionadaptationforcrosssubjectandcrosssessioneegemotionrecognition
AT jinming msmdamultisourcemarginaldistributionadaptationforcrosssubjectandcrosssessioneegemotionrecognition
AT lizhunan msmdamultisourcemarginaldistributionadaptationforcrosssubjectandcrosssessioneegemotionrecognition
AT fancunhang msmdamultisourcemarginaldistributionadaptationforcrosssubjectandcrosssessioneegemotionrecognition
AT lijinpeng msmdamultisourcemarginaldistributionadaptationforcrosssubjectandcrosssessioneegemotionrecognition
AT hehuiguang msmdamultisourcemarginaldistributionadaptationforcrosssubjectandcrosssessioneegemotionrecognition