
MindLink-Eumpy: An Open-Source Python Toolbox for Multimodal Emotion Recognition

Emotion recognition plays an important role in intelligent human–computer interaction, but the related research still faces the problems of low accuracy and subject dependence. In this paper, an open-source software toolbox called MindLink-Eumpy is developed to recognize emotions by integrating electroencephalogram (EEG) and facial expression information.

Full description below

Bibliographic Details
Main Authors: Li, Ruixin, Liang, Yan, Liu, Xiaojian, Wang, Bingbing, Huang, Wenxin, Cai, Zhaoxin, Ye, Yaoguang, Qiu, Lina, Pan, Jiahui
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7933462/
https://www.ncbi.nlm.nih.gov/pubmed/33679348
http://dx.doi.org/10.3389/fnhum.2021.621493
author Li, Ruixin
Liang, Yan
Liu, Xiaojian
Wang, Bingbing
Huang, Wenxin
Cai, Zhaoxin
Ye, Yaoguang
Qiu, Lina
Pan, Jiahui
author_sort Li, Ruixin
collection PubMed
description Emotion recognition plays an important role in intelligent human–computer interaction, but the related research still faces the problems of low accuracy and subject dependence. In this paper, an open-source software toolbox called MindLink-Eumpy is developed to recognize emotions by integrating electroencephalogram (EEG) and facial expression information. MindLink-Eumpy first applies a series of tools to automatically obtain physiological data from subjects, then analyzes the obtained facial expression data and EEG data, respectively, and finally fuses the two signals at the decision level. For the detection of facial expressions, MindLink-Eumpy uses a multitask convolutional neural network (CNN) based on transfer learning. For the detection of EEG, MindLink-Eumpy provides two algorithms: a subject-dependent model based on a support vector machine (SVM) and a subject-independent model based on a long short-term memory (LSTM) network. In the decision-level fusion, a weight enumerator and the AdaBoost technique are applied to combine the predictions of the SVM and the CNN. We conducted two offline experiments, on the Database for Emotion Analysis Using Physiological Signals (DEAP) dataset and the Multimodal Database for Affect Recognition and Implicit Tagging (MAHNOB-HCI) dataset, respectively, and an online experiment with 15 healthy subjects. The results show that multimodal methods outperform single-modal methods in both the offline and online experiments. In the subject-dependent condition, the multimodal method achieved an accuracy of 71.00% in the valence dimension and 72.14% in the arousal dimension. In the subject-independent condition, the LSTM-based method achieved an accuracy of 78.56% in the valence dimension and 77.22% in the arousal dimension. The feasibility and efficiency of MindLink-Eumpy for emotion recognition are thus demonstrated.
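The decision-level fusion described in the abstract can be illustrated with a minimal Python sketch. The code below is not the actual MindLink-Eumpy API: the function name, the weight step, and the toy data are assumptions for illustration, and the AdaBoost variant is omitted. It enumerates candidate fusion weights w in [0, 1] and keeps the one whose weighted average of the two models' probability outputs best matches held-out labels.

import numpy as np

# Hypothetical sketch of weight-enumeration fusion at the decision level;
# this is NOT the toolbox's real interface, only an illustration.
def fuse_by_weight_enumeration(p_eeg, p_face, labels, step=0.01):
    """Pick the weight w in [0, 1] that maximizes the accuracy of the
    fused score w * p_eeg + (1 - w) * p_face on validation data.

    p_eeg, p_face: (n_samples,) predicted probabilities of the positive
    class (e.g., high valence) from the EEG and facial-expression models.
    labels: (n_samples,) ground-truth binary labels (0 or 1).
    """
    best_w, best_acc = 0.0, -1.0
    for w in np.arange(0.0, 1.0 + step, step):
        fused = w * p_eeg + (1.0 - w) * p_face          # weighted average of the two models
        acc = np.mean((fused >= 0.5) == (labels == 1))  # accuracy at a 0.5 decision threshold
        if acc > best_acc:
            best_w, best_acc = w, acc
    return best_w, best_acc

# Toy usage with synthetic scores standing in for the SVM and CNN outputs.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)
p_eeg = np.clip(labels + rng.normal(0.0, 0.4, size=200), 0.0, 1.0)
p_face = np.clip(labels + rng.normal(0.0, 0.5, size=200), 0.0, 1.0)
w, acc = fuse_by_weight_enumeration(p_eeg, p_face, labels)
print(f"best EEG weight: {w:.2f}, fused validation accuracy: {acc:.2%}")

In the subject-dependent setting the paper describes, a weight chosen this way on validation trials would then be applied to combine the SVM (EEG) and CNN (facial expression) predictions on test trials.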
format Online
Article
Text
id pubmed-7933462
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-7933462 2021-03-06 MindLink-Eumpy: An Open-Source Python Toolbox for Multimodal Emotion Recognition. Li, Ruixin; Liang, Yan; Liu, Xiaojian; Wang, Bingbing; Huang, Wenxin; Cai, Zhaoxin; Ye, Yaoguang; Qiu, Lina; Pan, Jiahui. Front Hum Neurosci (Neuroscience). Frontiers Media S.A. 2021-02-19. /pmc/articles/PMC7933462/ /pubmed/33679348 http://dx.doi.org/10.3389/fnhum.2021.621493 Text en. Copyright © 2021 Li, Liang, Liu, Wang, Huang, Cai, Ye, Qiu and Pan. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title MindLink-Eumpy: An Open-Source Python Toolbox for Multimodal Emotion Recognition
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7933462/
https://www.ncbi.nlm.nih.gov/pubmed/33679348
http://dx.doi.org/10.3389/fnhum.2021.621493