
DRER: Deep Learning–Based Driver’s Real Emotion Recognizer

In intelligent vehicles, it is essential to monitor the driver’s condition; however, recognizing the driver’s emotional state is one of the most challenging and important tasks. Most previous studies focused on facial expression recognition to monitor the driver’s emotional state. However, while driving, many factors prevent drivers from revealing their emotions on their faces. To address this problem, we propose the driver’s real emotion recognizer (DRER), a deep learning-based algorithm that recognizes drivers’ real emotions, which cannot be completely identified from their facial expressions alone. The proposed algorithm comprises two models: (i) a facial expression recognition model, which follows a state-of-the-art convolutional neural network structure; and (ii) a sensor fusion emotion recognition model, which fuses the recognized state of facial expressions with electrodermal activity, a bio-physiological signal representing the electrical characteristics of the skin, to recognize the driver’s real emotional state. We categorized the driver’s emotions and conducted human-in-the-loop experiments to acquire the data. Experimental results show that the proposed fusion approach achieves a 114% increase in accuracy compared to using only facial expressions and a 146% increase compared to using only electrodermal activity. In conclusion, our proposed method achieves 86.8% recognition accuracy in recognizing the driver’s induced emotion while driving.


Bibliographic Details
Main Authors: Oh, Geesung, Ryu, Junghwan, Jeong, Euiseok, Yang, Ji Hyun, Hwang, Sungwook, Lee, Sangho, Lim, Sejoon
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8003797/
https://www.ncbi.nlm.nih.gov/pubmed/33808922
http://dx.doi.org/10.3390/s21062166
_version_ 1783671773937532928
author Oh, Geesung
Ryu, Junghwan
Jeong, Euiseok
Yang, Ji Hyun
Hwang, Sungwook
Lee, Sangho
Lim, Sejoon
author_facet Oh, Geesung
Ryu, Junghwan
Jeong, Euiseok
Yang, Ji Hyun
Hwang, Sungwook
Lee, Sangho
Lim, Sejoon
author_sort Oh, Geesung
collection PubMed
description In intelligent vehicles, it is essential to monitor the driver’s condition; however, recognizing the driver’s emotional state is one of the most challenging and important tasks. Most previous studies focused on facial expression recognition to monitor the driver’s emotional state. However, while driving, many factors prevent drivers from revealing their emotions on their faces. To address this problem, we propose the driver’s real emotion recognizer (DRER), a deep learning-based algorithm that recognizes drivers’ real emotions, which cannot be completely identified from their facial expressions alone. The proposed algorithm comprises two models: (i) a facial expression recognition model, which follows a state-of-the-art convolutional neural network structure; and (ii) a sensor fusion emotion recognition model, which fuses the recognized state of facial expressions with electrodermal activity, a bio-physiological signal representing the electrical characteristics of the skin, to recognize the driver’s real emotional state. We categorized the driver’s emotions and conducted human-in-the-loop experiments to acquire the data. Experimental results show that the proposed fusion approach achieves a 114% increase in accuracy compared to using only facial expressions and a 146% increase compared to using only electrodermal activity. In conclusion, our proposed method achieves 86.8% recognition accuracy in recognizing the driver’s induced emotion while driving.
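The abstract describes feature-level fusion of two modalities: the facial expression model’s output and electrodermal activity (EDA) features. As a minimal sketch of that idea, assuming hypothetical dimensions (7 expression classes, 16 EDA summary features, 4 emotion classes) and randomly initialized weights standing in for trained parameters — this is not the paper’s actual architecture, only an illustration of concatenation-based fusion through a small MLP:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, not taken from the paper:
N_FACE, N_EDA, N_EMOTION, HIDDEN = 7, 16, 4, 32

# Random weights stand in for trained fusion-network parameters.
W1 = rng.normal(0.0, 0.1, (N_FACE + N_EDA, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, N_EMOTION))
b2 = np.zeros(N_EMOTION)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def fuse(face_probs, eda_feats):
    """Feature-level fusion: concatenate both modalities, then a small MLP."""
    x = np.concatenate([face_probs, eda_feats], axis=-1)
    h = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden layer
    return softmax(h @ W2 + b2)        # emotion-class probabilities

face = softmax(rng.normal(size=N_FACE))  # output of the expression model
eda = rng.normal(size=N_EDA)             # windowed EDA statistics
p = fuse(face, eda)
print(p.shape, round(float(p.sum()), 6))  # (4,) 1.0
```

The fused classifier sees both modalities at once, so EDA evidence can override a neutral or suppressed facial expression, which is the motivation the abstract gives for fusion.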
format Online
Article
Text
id pubmed-8003797
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-80037972021-03-28 DRER: Deep Learning–Based Driver’s Real Emotion Recognizer Oh, Geesung Ryu, Junghwan Jeong, Euiseok Yang, Ji Hyun Hwang, Sungwook Lee, Sangho Lim, Sejoon Sensors (Basel) Article In intelligent vehicles, it is essential to monitor the driver’s condition; however, recognizing the driver’s emotional state is one of the most challenging and important tasks. Most previous studies focused on facial expression recognition to monitor the driver’s emotional state. However, while driving, many factors prevent drivers from revealing their emotions on their faces. To address this problem, we propose the driver’s real emotion recognizer (DRER), a deep learning-based algorithm that recognizes drivers’ real emotions, which cannot be completely identified from their facial expressions alone. The proposed algorithm comprises two models: (i) a facial expression recognition model, which follows a state-of-the-art convolutional neural network structure; and (ii) a sensor fusion emotion recognition model, which fuses the recognized state of facial expressions with electrodermal activity, a bio-physiological signal representing the electrical characteristics of the skin, to recognize the driver’s real emotional state. We categorized the driver’s emotions and conducted human-in-the-loop experiments to acquire the data. Experimental results show that the proposed fusion approach achieves a 114% increase in accuracy compared to using only facial expressions and a 146% increase compared to using only electrodermal activity. In conclusion, our proposed method achieves 86.8% recognition accuracy in recognizing the driver’s induced emotion while driving. MDPI 2021-03-19 /pmc/articles/PMC8003797/ /pubmed/33808922 http://dx.doi.org/10.3390/s21062166 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. 
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Oh, Geesung
Ryu, Junghwan
Jeong, Euiseok
Yang, Ji Hyun
Hwang, Sungwook
Lee, Sangho
Lim, Sejoon
DRER: Deep Learning–Based Driver’s Real Emotion Recognizer
title DRER: Deep Learning–Based Driver’s Real Emotion Recognizer
title_full DRER: Deep Learning–Based Driver’s Real Emotion Recognizer
title_fullStr DRER: Deep Learning–Based Driver’s Real Emotion Recognizer
title_full_unstemmed DRER: Deep Learning–Based Driver’s Real Emotion Recognizer
title_short DRER: Deep Learning–Based Driver’s Real Emotion Recognizer
title_sort drer: deep learning–based driver’s real emotion recognizer
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8003797/
https://www.ncbi.nlm.nih.gov/pubmed/33808922
http://dx.doi.org/10.3390/s21062166
work_keys_str_mv AT ohgeesung drerdeeplearningbaseddriversrealemotionrecognizer
AT ryujunghwan drerdeeplearningbaseddriversrealemotionrecognizer
AT jeongeuiseok drerdeeplearningbaseddriversrealemotionrecognizer
AT yangjihyun drerdeeplearningbaseddriversrealemotionrecognizer
AT hwangsungwook drerdeeplearningbaseddriversrealemotionrecognizer
AT leesangho drerdeeplearningbaseddriversrealemotionrecognizer
AT limsejoon drerdeeplearningbaseddriversrealemotionrecognizer