
Development of Real-Time Landmark-Based Emotion Recognition CNN for Masked Faces

Bibliographic Details
Main Authors: Farkhod, Akhmedov; Abdusalomov, Akmalbek Bobomirzaevich; Mukhiddinov, Mukhriddin; Cho, Young-Im
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9698760/
https://www.ncbi.nlm.nih.gov/pubmed/36433303
http://dx.doi.org/10.3390/s22228704
_version_ 1784838900104036352
author Farkhod, Akhmedov
Abdusalomov, Akmalbek Bobomirzaevich
Mukhiddinov, Mukhriddin
Cho, Young-Im
author_facet Farkhod, Akhmedov
Abdusalomov, Akmalbek Bobomirzaevich
Mukhiddinov, Mukhriddin
Cho, Young-Im
author_sort Farkhod, Akhmedov
collection PubMed
description Owing to the wide range of emotion recognition applications in daily life, such as mental state assessment, the demand for high-performance emotion recognition approaches remains high. However, the wearing of facial masks became indispensable during the COVID-19 pandemic, occluding the lower part of the face. In this study, we propose a graph-based emotion recognition method that relies on landmarks on the upper part of the face. Several pre-processing steps are applied, after which facial expression features are extracted from facial key points. The main steps of emotion recognition on masked faces are face detection using a Haar cascade classifier, landmark extraction with the MediaPipe Face Mesh model, and model training on seven emotional classes. The FER-2013 dataset was used for model training. An emotion detection model was first developed for non-masked faces; landmarks were then restricted to the upper part of the face. After faces were detected and landmark locations extracted, the coordinates of the landmarks for each emotional class were captured and exported to a comma-separated values (CSV) file, and the model weights were transferred to the emotional classes. Finally, the landmark-based emotion recognition model for the upper facial parts was tested both on images and in real time using a web camera application. The results showed that the proposed model achieved an overall accuracy of 91.2% for seven emotional classes on images, and image-based emotion detection yielded higher accuracy than real-time emotion detection.
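The landmark-extraction stage described in the abstract (Haar cascade face detection, MediaPipe Face Mesh landmarks, export of upper-face coordinates to a CSV file) can be sketched as follows. This is a minimal illustration, assuming OpenCV's bundled Haar cascade and the MediaPipe Face Mesh API; the upper-face selection rule (landmarks above the nose tip), the file names, and the CSV layout are illustrative assumptions, not the authors' exact choices.

import csv
import cv2
import mediapipe as mp

# Haar cascade face detector shipped with OpenCV, used here as the detection stage.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
mp_face_mesh = mp.solutions.face_mesh

def extract_upper_face_landmarks(image_bgr):
    # Return pixel (x, y) coordinates of face-mesh landmarks lying above the nose tip,
    # as a rough proxy for the unmasked upper part of the face; the paper's exact
    # landmark subset is not specified here (assumption).
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    with mp_face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as mesh:
        result = mesh.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_face_landmarks:
        return None
    landmarks = result.multi_face_landmarks[0].landmark
    h, w = image_bgr.shape[:2]
    nose_tip_y = landmarks[1].y  # index 1 is the nose tip in the MediaPipe face mesh
    return [(lm.x * w, lm.y * h) for lm in landmarks if lm.y <= nose_tip_y]

def append_sample(csv_path, label, coords):
    # Append one labelled sample as: class name followed by flattened x, y coordinates.
    row = [label] + [value for point in coords for value in point]
    with open(csv_path, "a", newline="") as f:
        csv.writer(f).writerow(row)

# Example usage with a hypothetical labelled image.
image = cv2.imread("happy_example.jpg")
if image is not None:
    coords = extract_upper_face_landmarks(image)
    if coords:
        append_sample("upper_face_landmarks.csv", "happy", coords)

Because the geometric filter above retains a variable number of landmarks per image, a fixed list of landmark indices (as the authors presumably used) would be needed to obtain fixed-length feature vectors for training the seven-class model.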
format Online
Article
Text
id pubmed-9698760
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9698760 2022-11-26 Development of Real-Time Landmark-Based Emotion Recognition CNN for Masked Faces Farkhod, Akhmedov Abdusalomov, Akmalbek Bobomirzaevich Mukhiddinov, Mukhriddin Cho, Young-Im Sensors (Basel) Article Owing to the wide range of emotion recognition applications in daily life, such as mental state assessment, the demand for high-performance emotion recognition approaches remains high. However, the wearing of facial masks became indispensable during the COVID-19 pandemic, occluding the lower part of the face. In this study, we propose a graph-based emotion recognition method that relies on landmarks on the upper part of the face. Several pre-processing steps are applied, after which facial expression features are extracted from facial key points. The main steps of emotion recognition on masked faces are face detection using a Haar cascade classifier, landmark extraction with the MediaPipe Face Mesh model, and model training on seven emotional classes. The FER-2013 dataset was used for model training. An emotion detection model was first developed for non-masked faces; landmarks were then restricted to the upper part of the face. After faces were detected and landmark locations extracted, the coordinates of the landmarks for each emotional class were captured and exported to a comma-separated values (CSV) file, and the model weights were transferred to the emotional classes. Finally, the landmark-based emotion recognition model for the upper facial parts was tested both on images and in real time using a web camera application. The results showed that the proposed model achieved an overall accuracy of 91.2% for seven emotional classes on images, and image-based emotion detection yielded higher accuracy than real-time emotion detection. MDPI 2022-11-11 /pmc/articles/PMC9698760/ /pubmed/36433303 http://dx.doi.org/10.3390/s22228704 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Farkhod, Akhmedov
Abdusalomov, Akmalbek Bobomirzaevich
Mukhiddinov, Mukhriddin
Cho, Young-Im
Development of Real-Time Landmark-Based Emotion Recognition CNN for Masked Faces
title Development of Real-Time Landmark-Based Emotion Recognition CNN for Masked Faces
title_full Development of Real-Time Landmark-Based Emotion Recognition CNN for Masked Faces
title_fullStr Development of Real-Time Landmark-Based Emotion Recognition CNN for Masked Faces
title_full_unstemmed Development of Real-Time Landmark-Based Emotion Recognition CNN for Masked Faces
title_short Development of Real-Time Landmark-Based Emotion Recognition CNN for Masked Faces
title_sort development of real-time landmark-based emotion recognition cnn for masked faces
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9698760/
https://www.ncbi.nlm.nih.gov/pubmed/36433303
http://dx.doi.org/10.3390/s22228704
work_keys_str_mv AT farkhodakhmedov developmentofrealtimelandmarkbasedemotionrecognitioncnnformaskedfaces
AT abdusalomovakmalbekbobomirzaevich developmentofrealtimelandmarkbasedemotionrecognitioncnnformaskedfaces
AT mukhiddinovmukhriddin developmentofrealtimelandmarkbasedemotionrecognitioncnnformaskedfaces
AT choyoungim developmentofrealtimelandmarkbasedemotionrecognitioncnnformaskedfaces