
Multimodal Approach for Emotion Recognition Based on Simulated Flight Experiments


Bibliographic Details
Main Authors: Válber César Cavalcanti Roza, Octavian Adrian Postolache
Format: Online Article Text
Language: English
Published: MDPI 2019
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6960577/
https://www.ncbi.nlm.nih.gov/pubmed/31847210
http://dx.doi.org/10.3390/s19245516
Collection: PubMed
Description: The present work addresses part of the gap regarding pilots' emotions and their bio-reactions during flight procedures such as takeoff, climbing, cruising, descent, initial approach, final approach, and landing. A sensing architecture and a set of experiments were developed and associated with several simulated flights ([Formula: see text]) using the Microsoft Flight Simulator Steam Edition (FSX-SE). The experiments were carried out with eight users who were beginners on the flight simulator ([Formula: see text]). It is shown that emotions can be recognized from different pilots in flight by combining their present and previous emotions. Heart Rate (HR), Galvanic Skin Response (GSR), and Electroencephalography (EEG) signals were used to extract emotions, together with the intensities of emotions detected from the pilot's face. Five main emotions were considered: happy, sad, angry, surprise, and scared. The emotion recognition is based on Artificial Neural Networks and Deep Learning techniques. Root Mean Squared Error (RMSE) and Mean Absolute Error (MAE) were the main metrics used to measure the quality of the regression output models. Tests of the produced models showed that the lowest recognition errors were reached when all data were considered or when the GSR datasets were omitted from model training. The emotion surprise was the easiest to recognize, with a mean RMSE of 0.13 and a mean MAE of 0.01, while sad was the hardest to recognize, with a mean RMSE of 0.82 and a mean MAE of 0.08. When only the highest emotion intensities over time were considered, the matching accuracies ranged between 55% and 100%.
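The RMSE and MAE metrics the abstract uses to evaluate the regression models can be sketched as follows. This is a minimal illustration only; the sample intensity values are hypothetical and not taken from the study.

```python
from math import sqrt

def rmse(y_true, y_pred):
    # Root Mean Squared Error: squares each deviation, so large errors dominate
    return sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    # Mean Absolute Error: average magnitude of the deviations
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical observed vs. predicted emotion intensities (scale 0-1)
observed  = [0.10, 0.40, 0.80, 0.30]
predicted = [0.15, 0.35, 0.70, 0.30]

print(round(rmse(observed, predicted), 4))  # → 0.0612
print(round(mae(observed, predicted), 4))   # → 0.05
```

Because RMSE penalizes large deviations more heavily than MAE, the two metrics together indicate whether a model's errors are uniformly small or occasionally large, which is why a per-emotion comparison (e.g. surprise vs. sad above) reports both.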
ID: pubmed-6960577
Institution: National Center for Biotechnology Information
Record format: MEDLINE/PubMed
Journal: Sensors (Basel)
Published online: 2019-12-13. © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).