
M1M2: Deep-Learning-Based Real-Time Emotion Recognition from Neural Activity

Emotion recognition, or the ability of computers to interpret people's emotional states, is a very active research area with vast applications to improve people's lives. However, most image-based emotion recognition techniques are flawed, as humans can intentionally hide their emotions by changing their facial expressions. Consequently, brain signals are being used to detect human emotions with improved accuracy, but most proposed systems demonstrate poor performance, as EEG signals are difficult to classify using standard machine learning and deep learning techniques. This paper proposes two convolutional neural network (CNN) models (M1: a heavily parameterized CNN, and M2: a lightly parameterized CNN), coupled with elegant feature extraction methods, for effective recognition. In this study, the most popular EEG benchmark dataset, DEAP, is used with two of its labels, valence and arousal, for binary classification. We use the Fast Fourier Transform to extract frequency-domain features, convolutional layers for deep features, and complementary features to represent the dataset. The M1 and M2 CNN models achieve nearly perfect accuracies of 99.89% and 99.22%, respectively, outperforming every previous state-of-the-art model. We empirically demonstrate that the M2 model requires only 2 seconds of EEG signal for 99.22% accuracy, and that it can achieve over 96% accuracy with only 125 milliseconds of EEG data for valence classification. Moreover, the proposed M2 model achieves 96.8% accuracy on valence using only 10% of the training dataset, demonstrating the proposed system's effectiveness. Documented implementation code for every experiment is published for reproducibility.

Bibliographic Details
Main Authors: Akter, Sumya; Prodhan, Rumman Ahmed; Pias, Tanmoy Sarkar; Eisenberg, David; Fresneda Fernandez, Jorge
Format: Online Article (Text)
Language: English
Published: MDPI, 2022
Subjects: Article
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9654596/
https://www.ncbi.nlm.nih.gov/pubmed/36366164
http://dx.doi.org/10.3390/s22218467
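The feature-extraction step described in the abstract (a Fast Fourier Transform producing frequency-domain features that feed the CNNs) can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' published code: the function name, band edges, and mean-power pooling are assumptions, while the 32-channel, 128 Hz layout reflects the standard preprocessed DEAP release.

```python
import numpy as np

def band_power_features(eeg_window, fs=128.0):
    """Per-channel FFT band power for conventional EEG bands.

    eeg_window: array of shape (channels, samples).
    The band edges below are the conventional theta/alpha/beta/gamma
    ranges; the paper's exact feature layout may differ.
    """
    bands = {"theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 45)}
    n = eeg_window.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)          # frequency of each FFT bin
    spectrum = np.abs(np.fft.rfft(eeg_window, axis=1)) ** 2  # power spectrum
    feats = []
    for lo, hi in bands.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(spectrum[:, mask].mean(axis=1))  # mean band power per channel
    return np.stack(feats, axis=1)                    # shape (channels, n_bands)

# A 2-second window from 32 channels at 128 Hz, mirroring the DEAP recordings.
window = np.random.randn(32, 256)
features = band_power_features(window)
print(features.shape)  # (32, 4)
```

A feature map of this shape (channels × bands, optionally over successive short windows) is the kind of compact input a lightly parameterized CNN such as the described M2 could classify quickly, which is consistent with the reported 125 ms and 2 s window results.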
Journal: Sensors (Basel)
Collection: PubMed (record pubmed-9654596, National Center for Biotechnology Information)
Published online: 2022-11-03
License: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).