Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression
An intriguing challenge in the human–robot interaction field is the prospect of endowing robots with emotional intelligence to make the interaction more genuine, intuitive, and natural. A crucial aspect in achieving this goal is the robot’s capability to infer and interpret human emotions. Thanks to...
Main Authors: | Filippini, Chiara; Perpetuini, David; Cardone, Daniela; Merla, Arcangelo
Format: | Online Article Text
Language: | English
Published: | MDPI, 2021
Subjects: | Article
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8512606/ https://www.ncbi.nlm.nih.gov/pubmed/34640758 http://dx.doi.org/10.3390/s21196438
_version_ | 1784583035084079104
author | Filippini, Chiara Perpetuini, David Cardone, Daniela Merla, Arcangelo |
author_facet | Filippini, Chiara Perpetuini, David Cardone, Daniela Merla, Arcangelo |
author_sort | Filippini, Chiara |
collection | PubMed |
description | An intriguing challenge in the human–robot interaction field is the prospect of endowing robots with emotional intelligence to make the interaction more genuine, intuitive, and natural. A crucial aspect in achieving this goal is the robot’s capability to infer and interpret human emotions. Thanks to its design and open programming platform, the NAO humanoid robot is one of the most widely used agents for human interaction. As with person-to-person communication, facial expressions are the privileged channel for recognizing the interlocutor’s emotional expressions. Although NAO is equipped with a facial expression recognition module, specific use cases may require additional features and affective computing capabilities that are not currently available. This study proposes a highly accurate convolutional-neural-network-based facial expression recognition model that is able to further enhance the NAO robot’s awareness of human facial expressions and provide the robot with an interlocutor’s arousal level detection capability. Indeed, the model tested during human–robot interactions was 91% and 90% accurate in recognizing happy and sad facial expressions, respectively; 75% accurate in recognizing surprised and scared expressions; and less accurate in recognizing neutral and angry expressions. Finally, the model was successfully integrated into the NAO SDK, thus allowing for high-performing facial expression classification with an inference time of 0.34 ± 0.04 s.
format | Online Article Text |
id | pubmed-8512606 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-8512606 2021-10-14 Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression Filippini, Chiara Perpetuini, David Cardone, Daniela Merla, Arcangelo Sensors (Basel) Article An intriguing challenge in the human–robot interaction field is the prospect of endowing robots with emotional intelligence to make the interaction more genuine, intuitive, and natural. A crucial aspect in achieving this goal is the robot’s capability to infer and interpret human emotions. Thanks to its design and open programming platform, the NAO humanoid robot is one of the most widely used agents for human interaction. As with person-to-person communication, facial expressions are the privileged channel for recognizing the interlocutor’s emotional expressions. Although NAO is equipped with a facial expression recognition module, specific use cases may require additional features and affective computing capabilities that are not currently available. This study proposes a highly accurate convolutional-neural-network-based facial expression recognition model that is able to further enhance the NAO robot’s awareness of human facial expressions and provide the robot with an interlocutor’s arousal level detection capability. Indeed, the model tested during human–robot interactions was 91% and 90% accurate in recognizing happy and sad facial expressions, respectively; 75% accurate in recognizing surprised and scared expressions; and less accurate in recognizing neutral and angry expressions. Finally, the model was successfully integrated into the NAO SDK, thus allowing for high-performing facial expression classification with an inference time of 0.34 ± 0.04 s. MDPI 2021-09-27 /pmc/articles/PMC8512606/ /pubmed/34640758 http://dx.doi.org/10.3390/s21196438 Text en © 2021 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle | Article Filippini, Chiara Perpetuini, David Cardone, Daniela Merla, Arcangelo Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression |
title | Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression |
title_full | Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression |
title_fullStr | Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression |
title_full_unstemmed | Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression |
title_short | Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression |
title_sort | improving human–robot interaction by enhancing nao robot awareness of human facial expression |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8512606/ https://www.ncbi.nlm.nih.gov/pubmed/34640758 http://dx.doi.org/10.3390/s21196438 |
work_keys_str_mv | AT filippinichiara improvinghumanrobotinteractionbyenhancingnaorobotawarenessofhumanfacialexpression AT perpetuinidavid improvinghumanrobotinteractionbyenhancingnaorobotawarenessofhumanfacialexpression AT cardonedaniela improvinghumanrobotinteractionbyenhancingnaorobotawarenessofhumanfacialexpression AT merlaarcangelo improvinghumanrobotinteractionbyenhancingnaorobotawarenessofhumanfacialexpression |
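The description above states that the authors' convolutional-neural-network facial expression model was integrated into the NAO SDK for live classification during interaction. The sketch below is a minimal, hypothetical illustration of that kind of integration, not the paper's actual implementation: it assumes the NAOqi Python SDK (qi, ALVideoDevice), OpenCV, and a pre-trained Keras CNN saved as fer_model.h5. The model file name, the 48×48 grayscale input size, the label order, and the robot IP are illustrative assumptions, and face detection/cropping is omitted for brevity.

```python
# Hypothetical sketch: run a CNN facial-expression classifier on one frame
# grabbed from the NAO camera via the NAOqi Python SDK. The model file name,
# input size, and label order below are assumptions, not details from the paper.
import numpy as np
import cv2
import qi
from tensorflow.keras.models import load_model

# Assumed label order; a real model's class ordering may differ.
LABELS = ["angry", "scared", "happy", "sad", "surprised", "neutral"]

def grab_nao_frame(session):
    """Fetch one BGR frame from NAO's top camera through ALVideoDevice."""
    video = session.service("ALVideoDevice")
    # Camera 0 (top), resolution 1 = 320x240, color space 13 = BGR, 10 fps.
    handle = video.subscribeCamera("fer_demo", 0, 1, 13, 10)
    try:
        frame = video.getImageRemote(handle)
        width, height, raw = frame[0], frame[1], frame[6]
        return np.frombuffer(raw, dtype=np.uint8).reshape((height, width, 3))
    finally:
        video.unsubscribe(handle)

def classify_expression(model, bgr_image):
    """Preprocess the frame and return the predicted expression label."""
    # A real pipeline would first detect and crop the face; here the whole
    # frame is simply converted to grayscale and resized to the CNN input size.
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    face = cv2.resize(gray, (48, 48)).astype("float32") / 255.0
    probs = model.predict(face.reshape(1, 48, 48, 1))[0]
    return LABELS[int(np.argmax(probs))]

if __name__ == "__main__":
    session = qi.Session()
    session.connect("tcp://192.168.1.10:9559")   # robot IP is a placeholder
    cnn = load_model("fer_model.h5")             # load once, reuse per frame
    print(classify_expression(cnn, grab_nao_frame(session)))
```

The paper reports an inference time of 0.34 ± 0.04 s once the model is wrapped into the NAO SDK; in the sketch the model is loaded once at startup and reused per frame, which is the natural design for sustaining that kind of per-frame latency during interaction.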