
Human and machine validation of 14 databases of dynamic facial expressions

With a shift in interest toward dynamic expressions, numerous corpora of dynamic facial stimuli have been developed over the past two decades. The present research aimed to test existing sets of dynamic facial expressions (published between 2000 and 2015) in a cross-corpus validation effort. For this, 14 dynamic databases were selected that featured facial expressions of the basic six emotions (anger, disgust, fear, happiness, sadness, surprise) in posed or spontaneous form. In Study 1, a subset of stimuli from each database (N = 162) were presented to human observers and machine analysis, yielding considerable variance in emotion recognition performance across the databases. Classification accuracy further varied with perceived intensity and naturalness of the displays, with posed expressions being judged more accurately and as intense, but less natural compared to spontaneous ones. Study 2 aimed for a full validation of the 14 databases by subjecting the entire stimulus set (N = 3812) to machine analysis. A FACS-based Action Unit (AU) analysis revealed that facial AU configurations were more prototypical in posed than spontaneous expressions. The prototypicality of an expression in turn predicted emotion classification accuracy, with higher performance observed for more prototypical facial behavior. Furthermore, technical features of each database (i.e., duration, face box size, head rotation, and motion) had a significant impact on recognition accuracy. Together, the findings suggest that existing databases vary in their ability to signal specific emotions, thereby facing a trade-off between realism and ecological validity on the one end, and expression uniformity and comparability on the other.


Bibliographic Details
Main Authors: Krumhuber, Eva G., Küster, Dennis, Namba, Shushi, Skora, Lina
Format: Online Article (Text)
Language: English
Published: Springer US, 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8062366/
https://www.ncbi.nlm.nih.gov/pubmed/32804342
http://dx.doi.org/10.3758/s13428-020-01443-y
Record Details
Collection: PubMed (id: pubmed-8062366)
Institution: National Center for Biotechnology Information
Journal: Behav Res Methods
Record format: MEDLINE/PubMed
Published online: 2020-08-17 by Springer US
License: © The Author(s) 2020. Open Access under the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution, and reproduction in any medium or format, as long as appropriate credit is given to the original author(s) and the source, a link to the licence is provided, and any changes are indicated. Material not covered by the article's licence, or used beyond what statutory regulation permits, requires permission directly from the copyright holder.
Electronic supplementary material: The online version of this article (10.3758/s13428-020-01443-y) contains supplementary material, which is available to authorized users.