Hybrid transfer learning strategy for cross-subject EEG emotion recognition
Main Authors: | Lu, Wei; Liu, Haiyan; Ma, Hua; Tan, Tien-Ping; Xia, Lingnan |
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2023 |
Subjects: | Human Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10687359/ https://www.ncbi.nlm.nih.gov/pubmed/38034069 http://dx.doi.org/10.3389/fnhum.2023.1280241 |
_version_ | 1785151960818647040 |
author | Lu, Wei Liu, Haiyan Ma, Hua Tan, Tien-Ping Xia, Lingnan |
author_facet | Lu, Wei Liu, Haiyan Ma, Hua Tan, Tien-Ping Xia, Lingnan |
author_sort | Lu, Wei |
collection | PubMed |
description | Emotion recognition constitutes a pivotal research topic within affective computing, owing to its potential applications across various domains. Currently, emotion recognition methods based on deep learning frameworks utilizing electroencephalogram (EEG) signals have demonstrated effective application and achieved impressive performance. However, in EEG-based emotion recognition, there exists a significant performance drop in cross-subject EEG emotion recognition due to inter-individual differences among subjects. To address this challenge, a hybrid transfer learning strategy is proposed, and the Domain Adaptation with a Few-shot Fine-tuning Network (DFF-Net) is designed for cross-subject EEG emotion recognition. The first step involves the design of a domain adaptive learning module specialized for EEG emotion recognition, known as the Emo-DA module. Following this, the Emo-DA module is utilized to pre-train a model on both the source and target domains. Subsequently, fine-tuning is performed on the target domain specifically for the purpose of cross-subject EEG emotion recognition testing. This comprehensive approach effectively harnesses the attributes of domain adaptation and fine-tuning, resulting in a noteworthy improvement in the accuracy of the model for the challenging task of cross-subject EEG emotion recognition. The proposed DFF-Net surpasses the state-of-the-art methods in the cross-subject EEG emotion recognition task, achieving an average recognition accuracy of 93.37% on the SEED dataset and 82.32% on the SEED-IV dataset. |
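The two-stage strategy described above (domain-adaptive pre-training across subjects, followed by few-shot fine-tuning on the target subject) rests on an alignment loss between source- and target-domain feature distributions. The record does not spell out the internals of the Emo-DA module, so the sketch below uses a linear-kernel Maximum Mean Discrepancy (MMD) as an assumed, representative alignment loss; the 310-dimensional features follow the common SEED convention of 62 EEG channels × 5 frequency bands, purely for illustration.

```python
import numpy as np

def mmd_linear(source: np.ndarray, target: np.ndarray) -> float:
    """Linear-kernel Maximum Mean Discrepancy between two feature batches.

    A classic domain-adaptation alignment term: the squared distance between
    the mean feature vectors of the source and target batches. Minimizing it
    during pre-training pulls the two subjects' feature distributions together.
    """
    delta = source.mean(axis=0) - target.mean(axis=0)
    return float(delta @ delta)

# Toy EEG-like feature batches: 62 channels x 5 bands = 310 dims (SEED-style).
rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(32, 310))  # batch from source subjects
tgt = rng.normal(0.5, 1.0, size=(32, 310))  # batch from the target subject (shifted)

# A domain gap shows up as a large MMD; identical batches give exactly zero.
print(mmd_linear(src, tgt) > mmd_linear(src, src))  # → True
```

In a full pipeline, this term would be added to the classification loss during pre-training on source plus unlabeled target data, after which the classifier head is fine-tuned on the few labeled target-subject trials.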
format | Online Article Text |
id | pubmed-10687359 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-106873592023-11-30 Hybrid transfer learning strategy for cross-subject EEG emotion recognition Lu, Wei Liu, Haiyan Ma, Hua Tan, Tien-Ping Xia, Lingnan Front Hum Neurosci Human Neuroscience Emotion recognition constitutes a pivotal research topic within affective computing, owing to its potential applications across various domains. Currently, emotion recognition methods based on deep learning frameworks utilizing electroencephalogram (EEG) signals have demonstrated effective application and achieved impressive performance. However, in EEG-based emotion recognition, there exists a significant performance drop in cross-subject EEG emotion recognition due to inter-individual differences among subjects. To address this challenge, a hybrid transfer learning strategy is proposed, and the Domain Adaptation with a Few-shot Fine-tuning Network (DFF-Net) is designed for cross-subject EEG emotion recognition. The first step involves the design of a domain adaptive learning module specialized for EEG emotion recognition, known as the Emo-DA module. Following this, the Emo-DA module is utilized to pre-train a model on both the source and target domains. Subsequently, fine-tuning is performed on the target domain specifically for the purpose of cross-subject EEG emotion recognition testing. This comprehensive approach effectively harnesses the attributes of domain adaptation and fine-tuning, resulting in a noteworthy improvement in the accuracy of the model for the challenging task of cross-subject EEG emotion recognition. The proposed DFF-Net surpasses the state-of-the-art methods in the cross-subject EEG emotion recognition task, achieving an average recognition accuracy of 93.37% on the SEED dataset and 82.32% on the SEED-IV dataset. Frontiers Media S.A. 2023-11-16 /pmc/articles/PMC10687359/ /pubmed/38034069 http://dx.doi.org/10.3389/fnhum.2023.1280241 Text en Copyright © 2023 Lu, Liu, Ma, Tan and Xia.
https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Human Neuroscience Lu, Wei Liu, Haiyan Ma, Hua Tan, Tien-Ping Xia, Lingnan Hybrid transfer learning strategy for cross-subject EEG emotion recognition |
title | Hybrid transfer learning strategy for cross-subject EEG emotion recognition |
title_full | Hybrid transfer learning strategy for cross-subject EEG emotion recognition |
title_fullStr | Hybrid transfer learning strategy for cross-subject EEG emotion recognition |
title_full_unstemmed | Hybrid transfer learning strategy for cross-subject EEG emotion recognition |
title_short | Hybrid transfer learning strategy for cross-subject EEG emotion recognition |
title_sort | hybrid transfer learning strategy for cross-subject eeg emotion recognition |
topic | Human Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10687359/ https://www.ncbi.nlm.nih.gov/pubmed/38034069 http://dx.doi.org/10.3389/fnhum.2023.1280241 |
work_keys_str_mv | AT luwei hybridtransferlearningstrategyforcrosssubjecteegemotionrecognition AT liuhaiyan hybridtransferlearningstrategyforcrosssubjecteegemotionrecognition AT mahua hybridtransferlearningstrategyforcrosssubjecteegemotionrecognition AT tantienping hybridtransferlearningstrategyforcrosssubjecteegemotionrecognition AT xialingnan hybridtransferlearningstrategyforcrosssubjecteegemotionrecognition |