Validation of dynamic virtual faces for facial affect recognition

The ability to recognise facial emotions is essential for successful social interaction. The most common stimuli used when evaluating this ability are photographs. Although these stimuli have proved to be valid, they do not offer the level of realism that virtual humans have achieved. The objective of the present paper is the validation of a new set of dynamic virtual faces (DVFs) that mimic the six basic emotions plus the neutral expression. The faces are prepared to be observed with low and high dynamism, and from front and side views. For this purpose, 204 healthy participants, stratified by gender, age and education level, were recruited for assessing their facial affect recognition with the set of DVFs. The accuracy in responses was compared with the already validated Penn Emotion Recognition Test (ER-40). The results showed that DVFs were as valid as standardised natural faces for accurately recreating human-like facial expressions. The overall accuracy in the identification of emotions was higher for the DVFs (88.25%) than for the ER-40 faces (82.60%). The percentage of hits of each DVF emotion was high, especially for neutral expression and happiness emotion. No statistically significant differences were discovered regarding gender. Nor were significant differences found between younger adults and adults over 60 years. Moreover, there is an increase of hits for avatar faces showing a greater dynamism, as well as front views of the DVFs compared to their profile presentations. DVFs are as valid as standardised natural faces for accurately recreating human-like facial expressions of emotions.

Bibliographic Details
Main Authors: Fernández-Sotos, Patricia; García, Arturo S.; Vicente-Querol, Miguel A.; Lahera, Guillermo; Rodriguez-Jimenez, Roberto; Fernández-Caballero, Antonio
Format: Online Article Text
Language: English
Published: Public Library of Science, 2021
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7833130/
https://www.ncbi.nlm.nih.gov/pubmed/33493234
http://dx.doi.org/10.1371/journal.pone.0246001
author Fernández-Sotos, Patricia
García, Arturo S.
Vicente-Querol, Miguel A.
Lahera, Guillermo
Rodriguez-Jimenez, Roberto
Fernández-Caballero, Antonio
author_sort Fernández-Sotos, Patricia
collection PubMed
description The ability to recognise facial emotions is essential for successful social interaction. The most common stimuli used when evaluating this ability are photographs. Although these stimuli have proved to be valid, they do not offer the level of realism that virtual humans have achieved. The objective of the present paper is the validation of a new set of dynamic virtual faces (DVFs) that mimic the six basic emotions plus the neutral expression. The faces are prepared to be observed with low and high dynamism, and from front and side views. For this purpose, 204 healthy participants, stratified by gender, age and education level, were recruited for assessing their facial affect recognition with the set of DVFs. The accuracy in responses was compared with the already validated Penn Emotion Recognition Test (ER-40). The results showed that DVFs were as valid as standardised natural faces for accurately recreating human-like facial expressions. The overall accuracy in the identification of emotions was higher for the DVFs (88.25%) than for the ER-40 faces (82.60%). The percentage of hits of each DVF emotion was high, especially for neutral expression and happiness emotion. No statistically significant differences were discovered regarding gender. Nor were significant differences found between younger adults and adults over 60 years. Moreover, there is an increase of hits for avatar faces showing a greater dynamism, as well as front views of the DVFs compared to their profile presentations. DVFs are as valid as standardised natural faces for accurately recreating human-like facial expressions of emotions.
format Online
Article
Text
id pubmed-7833130
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-7833130 2021-01-26
Validation of dynamic virtual faces for facial affect recognition
Fernández-Sotos, Patricia; García, Arturo S.; Vicente-Querol, Miguel A.; Lahera, Guillermo; Rodriguez-Jimenez, Roberto; Fernández-Caballero, Antonio
PLoS One (Research Article). Public Library of Science, 2021-01-25.
/pmc/articles/PMC7833130/ /pubmed/33493234 http://dx.doi.org/10.1371/journal.pone.0246001
Text en. © 2021 Fernández-Sotos et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
title Validation of dynamic virtual faces for facial affect recognition
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7833130/
https://www.ncbi.nlm.nih.gov/pubmed/33493234
http://dx.doi.org/10.1371/journal.pone.0246001