
CNN and LSTM-Based Emotion Charting Using Physiological Signals

Novel trends in affective computing are based on reliable sources of physiological signals such as Electroencephalogram (EEG), Electrocardiogram (ECG), and Galvanic Skin Response (GSR). The use of these signals provides challenges of performance improvement within a broader set of emotion classes in a less constrained real-world environment.


Bibliographic Details
Main Authors: Dar, Muhammad Najam, Akram, Muhammad Usman, Khawaja, Sajid Gul, Pujari, Amit N.
Format: Online Article Text
Language: English
Published: MDPI 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7472085/
https://www.ncbi.nlm.nih.gov/pubmed/32823807
http://dx.doi.org/10.3390/s20164551
_version_ 1783578907959623680
author Dar, Muhammad Najam
Akram, Muhammad Usman
Khawaja, Sajid Gul
Pujari, Amit N.
author_facet Dar, Muhammad Najam
Akram, Muhammad Usman
Khawaja, Sajid Gul
Pujari, Amit N.
author_sort Dar, Muhammad Najam
collection PubMed
description Novel trends in affective computing are based on reliable sources of physiological signals such as Electroencephalogram (EEG), Electrocardiogram (ECG), and Galvanic Skin Response (GSR). The use of these signals provides challenges of performance improvement within a broader set of emotion classes in a less constrained real-world environment. To overcome these challenges, we propose a computational framework of 2D Convolutional Neural Network (CNN) architecture for the arrangement of 14 channels of EEG, and a combination of Long Short-Term Memory (LSTM) and 1D-CNN architecture for ECG and GSR. Our approach is subject-independent and incorporates two publicly available datasets of DREAMER and AMIGOS with low-cost, wearable sensors to extract physiological signals suitable for real-world environments. The results outperform state-of-the-art approaches for classification into four classes, namely High Valence—High Arousal, High Valence—Low Arousal, Low Valence—High Arousal, and Low Valence—Low Arousal. Emotion elicitation average accuracy of [Formula: see text] is achieved with ECG right-channel modality, 76.65% with EEG modality, and 63.67% with GSR modality for AMIGOS. The overall highest accuracy of 99.0% for the AMIGOS dataset and 90.8% for the DREAMER dataset is achieved with multi-modal fusion. A strong correlation between spectral- and hidden-layer feature analysis with classification performance suggests the efficacy of the proposed method for significant feature extraction and higher emotion elicitation performance to a broader context for less constrained environments.
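The description above specifies a 2D CNN over an arrangement of 14 EEG channels, and an LSTM combined with a 1D CNN for ECG and GSR, classifying the four valence-arousal quadrants. Below is a minimal sketch of what such branch architectures might look like; layer counts, kernel sizes, window lengths, and class handling are illustrative assumptions, since this record does not give the authors' exact configuration.

```python
# Hypothetical sketch, not the authors' implementation: one 2D-CNN branch for
# 14-channel EEG and one LSTM + 1D-CNN branch for a peripheral signal (ECG or GSR).
import torch
import torch.nn as nn

class EEG2DCNN(nn.Module):
    """2D CNN over a (channels x time) arrangement of 14 EEG channels."""
    def __init__(self, n_classes: int = 4, n_channels: int = 14, n_samples: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(3, 5), padding=(1, 2)), nn.ReLU(),
            nn.MaxPool2d((1, 2)),
            nn.Conv2d(16, 32, kernel_size=(3, 5), padding=(1, 2)), nn.ReLU(),
            nn.MaxPool2d((2, 2)),
        )
        # Infer the flattened feature size from a dummy pass (assumed input shape).
        with torch.no_grad():
            flat = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(flat, n_classes)

    def forward(self, x):          # x: (batch, 1, 14, n_samples)
        return self.classifier(self.features(x).flatten(1))

class EcgGsrLstmCnn(nn.Module):
    """LSTM followed by a 1D CNN for a single peripheral signal (ECG or GSR)."""
    def __init__(self, n_classes: int = 4, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.conv = nn.Sequential(
            nn.Conv1d(hidden, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):          # x: (batch, time, 1)
        h, _ = self.lstm(x)        # (batch, time, hidden)
        z = self.conv(h.transpose(1, 2)).squeeze(-1)
        return self.classifier(z)

# Four target classes per the abstract: HVHA, HVLA, LVHA, LVLA.
eeg_logits = EEG2DCNN()(torch.randn(8, 1, 14, 128))
ecg_logits = EcgGsrLstmCnn()(torch.randn(8, 256, 1))
```

The multi-modal fusion reported in the abstract (99.0% on AMIGOS, 90.8% on DREAMER) would combine the outputs of such branches; the fusion strategy itself is not specified in this record.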
format Online
Article
Text
id pubmed-7472085
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-74720852020-09-04 CNN and LSTM-Based Emotion Charting Using Physiological Signals Dar, Muhammad Najam Akram, Muhammad Usman Khawaja, Sajid Gul Pujari, Amit N. Sensors (Basel) Article Novel trends in affective computing are based on reliable sources of physiological signals such as Electroencephalogram (EEG), Electrocardiogram (ECG), and Galvanic Skin Response (GSR). The use of these signals provides challenges of performance improvement within a broader set of emotion classes in a less constrained real-world environment. To overcome these challenges, we propose a computational framework of 2D Convolutional Neural Network (CNN) architecture for the arrangement of 14 channels of EEG, and a combination of Long Short-Term Memory (LSTM) and 1D-CNN architecture for ECG and GSR. Our approach is subject-independent and incorporates two publicly available datasets of DREAMER and AMIGOS with low-cost, wearable sensors to extract physiological signals suitable for real-world environments. The results outperform state-of-the-art approaches for classification into four classes, namely High Valence—High Arousal, High Valence—Low Arousal, Low Valence—High Arousal, and Low Valence—Low Arousal. Emotion elicitation average accuracy of [Formula: see text] is achieved with ECG right-channel modality, 76.65% with EEG modality, and 63.67% with GSR modality for AMIGOS. The overall highest accuracy of 99.0% for the AMIGOS dataset and 90.8% for the DREAMER dataset is achieved with multi-modal fusion. A strong correlation between spectral- and hidden-layer feature analysis with classification performance suggests the efficacy of the proposed method for significant feature extraction and higher emotion elicitation performance to a broader context for less constrained environments. MDPI 2020-08-14 /pmc/articles/PMC7472085/ /pubmed/32823807 http://dx.doi.org/10.3390/s20164551 Text en © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Dar, Muhammad Najam
Akram, Muhammad Usman
Khawaja, Sajid Gul
Pujari, Amit N.
CNN and LSTM-Based Emotion Charting Using Physiological Signals
title CNN and LSTM-Based Emotion Charting Using Physiological Signals
title_full CNN and LSTM-Based Emotion Charting Using Physiological Signals
title_fullStr CNN and LSTM-Based Emotion Charting Using Physiological Signals
title_full_unstemmed CNN and LSTM-Based Emotion Charting Using Physiological Signals
title_short CNN and LSTM-Based Emotion Charting Using Physiological Signals
title_sort cnn and lstm-based emotion charting using physiological signals
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7472085/
https://www.ncbi.nlm.nih.gov/pubmed/32823807
http://dx.doi.org/10.3390/s20164551
work_keys_str_mv AT darmuhammadnajam cnnandlstmbasedemotionchartingusingphysiologicalsignals
AT akrammuhammadusman cnnandlstmbasedemotionchartingusingphysiologicalsignals
AT khawajasajidgul cnnandlstmbasedemotionchartingusingphysiologicalsignals
AT pujariamitn cnnandlstmbasedemotionchartingusingphysiologicalsignals