Adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an RGB-D sensor

This work presents the design and analysis of an Adaptive User Interface (AUI) for a desktop application that uses a novel solution for the recognition of the emotional state of a user through both facial expressions and body posture from an RGB-D sensor. Six basic emotions are recognized through facial expressions, in addition to the physiological state, which is recognized through the body posture. The facial expressions and body posture are acquired in real time from a Kinect sensor. A scoring system is used to improve recognition by minimizing the confusion between the different emotions. The implemented solution achieves an accuracy rate above 90%. The recognized emotion is then used to derive an Automatic AUI, where the user can use speech commands to modify the User Interface (UI) automatically. A comprehensive user study is performed to compare the usability of Automatic, Manual, and Hybrid AUIs. The AUIs are evaluated in terms of their efficiency, effectiveness, productivity, and error safety. Additionally, a comprehensive analysis is performed to evaluate the results from the viewpoint of different genders and age groups. Results show that the hybrid adaptation improves usability in terms of productivity and efficiency. Finally, a combination of both the automatic and hybrid AUIs results in a significantly more positive user experience than the manual adaptation.
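
The abstract mentions a scoring system that fuses facial-expression and body-posture cues to reduce confusion between emotions, but does not reproduce the scoring rules in this record. The Python sketch below is a minimal, hypothetical illustration of such a scheme; the weights, window size, margin, and function names are assumptions for illustration only and are not taken from the paper.

# Illustrative sketch: fuse per-frame facial-expression and body-posture
# confidence scores into a single emotion decision. All weights, thresholds,
# and names here are hypothetical, not the authors' implementation.

from collections import Counter

EMOTIONS = ["happiness", "sadness", "anger", "surprise", "fear", "disgust"]

def fuse_scores(face_scores, posture_scores, face_weight=0.7):
    """Weighted sum of face and posture scores for each candidate emotion."""
    return {
        e: face_weight * face_scores.get(e, 0.0)
           + (1.0 - face_weight) * posture_scores.get(e, 0.0)
        for e in EMOTIONS
    }

def decide_emotion(frame_scores, window=30, margin=0.15):
    """Accumulate fused scores over a sliding window of frames and report an
    emotion only when the top candidate beats the runner-up by a margin,
    which is one simple way to reduce confusion between similar emotions."""
    totals = Counter()
    for scores in frame_scores[-window:]:
        totals.update(scores)
    ranked = totals.most_common(2)
    if not ranked:
        return "neutral"  # no data yet
    if len(ranked) == 1 or ranked[0][1] - ranked[1][1] >= margin:
        return ranked[0][0]
    return "neutral"  # too close to call: report no dominant emotion

# Example with two hypothetical frames of classifier output.
frames = [
    fuse_scores({"happiness": 0.8, "surprise": 0.3}, {"happiness": 0.6}),
    fuse_scores({"happiness": 0.7, "surprise": 0.4}, {"happiness": 0.5}),
]
print(decide_emotion(frames))  # -> happiness

Requiring the winning emotion to clear a margin over a short window of frames is one plausible reading of "minimizing the confusion between the different emotions"; the paper's actual scoring rules may differ.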

Bibliographic Details
Main Authors: Medjden, Selma; Ahmed, Naveed; Lataifeh, Mohammed
Format: Online Article Text
Language: English
Published: Public Library of Science, 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7365406/
https://www.ncbi.nlm.nih.gov/pubmed/32673325
http://dx.doi.org/10.1371/journal.pone.0235908
_version_ 1783560027203698688
author Medjden, Selma
Ahmed, Naveed
Lataifeh, Mohammed
author_facet Medjden, Selma
Ahmed, Naveed
Lataifeh, Mohammed
author_sort Medjden, Selma
collection PubMed
description This work presents the design and analysis of an Adaptive User Interface (AUI) for a desktop application that uses a novel solution for the recognition of the emotional state of a user through both facial expressions and body posture from an RGB-D sensor. Six basic emotions are recognized through facial expressions, in addition to the physiological state, which is recognized through the body posture. The facial expressions and body posture are acquired in real time from a Kinect sensor. A scoring system is used to improve recognition by minimizing the confusion between the different emotions. The implemented solution achieves an accuracy rate above 90%. The recognized emotion is then used to derive an Automatic AUI, where the user can use speech commands to modify the User Interface (UI) automatically. A comprehensive user study is performed to compare the usability of Automatic, Manual, and Hybrid AUIs. The AUIs are evaluated in terms of their efficiency, effectiveness, productivity, and error safety. Additionally, a comprehensive analysis is performed to evaluate the results from the viewpoint of different genders and age groups. Results show that the hybrid adaptation improves usability in terms of productivity and efficiency. Finally, a combination of both the automatic and hybrid AUIs results in a significantly more positive user experience than the manual adaptation.
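To make the three adaptation modes compared in the user study more concrete, the short Python sketch below shows one hypothetical way a recognized emotion could drive the UI under manual, automatic, and hybrid adaptation. The emotion-to-setting mapping, mode names as function arguments, and the confirmation callback are illustrative assumptions, not the desktop application described in the paper.

# Illustrative sketch of emotion-driven UI adaptation modes.
# The adaptation table and confirmation step are hypothetical.

ADAPTATIONS = {
    "sadness":   {"theme": "warm",    "font_scale": 1.2},
    "anger":     {"theme": "calm",    "font_scale": 1.0},
    "happiness": {"theme": "default", "font_scale": 1.0},
}

def adapt_ui(emotion, mode, confirm=None):
    """Return the UI settings to apply for a recognized emotion.

    mode: 'automatic' applies the proposed change immediately,
          'hybrid' proposes it and applies it only if the user confirms
          (e.g. via a speech command), 'manual' leaves the UI unchanged.
    """
    proposal = ADAPTATIONS.get(emotion)
    if proposal is None or mode == "manual":
        return None
    if mode == "automatic":
        return proposal
    if mode == "hybrid":
        return proposal if confirm and confirm(proposal) else None
    raise ValueError(f"unknown mode: {mode}")

# Example: hybrid mode with a stubbed-in confirmation callback.
print(adapt_ui("sadness", "hybrid", confirm=lambda p: True))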
format Online
Article
Text
id pubmed-7365406
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-7365406 2020-07-27
Adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an RGB-D sensor
Medjden, Selma; Ahmed, Naveed; Lataifeh, Mohammed
PLoS One, Research Article. Public Library of Science, 2020-07-16.
/pmc/articles/PMC7365406/ /pubmed/32673325 http://dx.doi.org/10.1371/journal.pone.0235908
© 2020 Medjden et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Medjden, Selma
Ahmed, Naveed
Lataifeh, Mohammed
Adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an RGB-D sensor
title Adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an RGB-D sensor
title_full Adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an RGB-D sensor
title_fullStr Adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an RGB-D sensor
title_full_unstemmed Adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an RGB-D sensor
title_short Adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an RGB-D sensor
title_sort adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an rgb-d sensor
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7365406/
https://www.ncbi.nlm.nih.gov/pubmed/32673325
http://dx.doi.org/10.1371/journal.pone.0235908
work_keys_str_mv AT medjdenselma adaptiveuserinterfacedesignandanalysisusingemotionrecognitionthroughfacialexpressionsandbodyposturefromanrgbdsensor
AT ahmednaveed adaptiveuserinterfacedesignandanalysisusingemotionrecognitionthroughfacialexpressionsandbodyposturefromanrgbdsensor
AT lataifehmohammed adaptiveuserinterfacedesignandanalysisusingemotionrecognitionthroughfacialexpressionsandbodyposturefromanrgbdsensor