Towards Context-Aware Facial Emotion Reaction Database for Dyadic Interaction Settings
| Main Authors: | Sham, Abdallah Hussein; Khan, Amna; Lamas, David; Tikka, Pia; Anbarjafari, Gholamreza |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | MDPI, 2023 |
| Subjects: | |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9824663/ https://www.ncbi.nlm.nih.gov/pubmed/36617055 http://dx.doi.org/10.3390/s23010458 |
_version_ | 1784866465423294464 |
author | Sham, Abdallah Hussein Khan, Amna Lamas, David Tikka, Pia Anbarjafari, Gholamreza |
author_facet | Sham, Abdallah Hussein Khan, Amna Lamas, David Tikka, Pia Anbarjafari, Gholamreza |
author_sort | Sham, Abdallah Hussein |
collection | PubMed |
description | Emotion recognition is a significant issue in many sectors that use human emotion reactions as communication for marketing, technological equipment, or human–robot interaction. The realistic facial behavior of social robots and artificial agents is still a challenge, limiting their emotional credibility in dyadic face-to-face situations with humans. One obstacle is the lack of appropriate training data on how humans typically interact in such settings. This article focuses on collecting the facial behavior of 60 participants to create a new type of dyadic emotion reaction database. For this purpose, we propose a methodology that automatically captures the facial expressions of participants via webcam while they are engaged with other people (facial videos) in emotionally primed contexts. The data were then analyzed using three different Facial Expression Analysis (FEA) tools: iMotions, the Mini-Xception model, and the Py-Feat FEA toolkit. Although the emotion reactions were reported as genuine, the three models did not agree on a single emotion reaction prediction, indicating that a more robust and effective model for emotion reaction prediction is needed. The relevance of this work for human–computer interaction studies lies in its novel approach to developing adaptive behaviors for synthetic human-like beings (virtual or robotic), allowing them to simulate human facial interaction behavior in contextually varying dyadic situations with humans. This article should be useful for researchers using human emotion analysis when deciding on a suitable methodology for collecting facial expression reactions in a dyadic setting. |
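For readers weighing the FEA tools named in the abstract, the following is a minimal sketch of the kind of per-frame emotion scoring described, using the open-source Py-Feat toolkit. This is not the authors' pipeline: the video file name is hypothetical, and the Detector/Fex interface is assumed from Py-Feat's documented API, which has changed between releases.

```python
# Minimal sketch of per-frame emotion scoring with Py-Feat, one of the three
# FEA tools named in the abstract. The file name "reaction_clip.mp4" is a
# placeholder, and the exact Detector/Fex interface should be checked against
# the Py-Feat version installed.
from feat import Detector

detector = Detector()  # default face, landmark, AU, and emotion models
fex = detector.detect_video("reaction_clip.mp4")  # hypothetical webcam recording

# Fex results expose per-frame emotion probabilities (anger, disgust, fear,
# happiness, sadness, surprise, neutral); take the arg-max as the predicted
# reaction for each frame and summarize the distribution across the clip.
per_frame = fex.emotions.idxmax(axis=1)
print(per_frame.value_counts(normalize=True))
```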
format | Online Article Text |
id | pubmed-9824663 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9824663 2023-01-08 Towards Context-Aware Facial Emotion Reaction Database for Dyadic Interaction Settings Sham, Abdallah Hussein Khan, Amna Lamas, David Tikka, Pia Anbarjafari, Gholamreza Sensors (Basel) Article MDPI 2023-01-01 /pmc/articles/PMC9824663/ /pubmed/36617055 http://dx.doi.org/10.3390/s23010458 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Sham, Abdallah Hussein Khan, Amna Lamas, David Tikka, Pia Anbarjafari, Gholamreza Towards Context-Aware Facial Emotion Reaction Database for Dyadic Interaction Settings |
title | Towards Context-Aware Facial Emotion Reaction Database for Dyadic Interaction Settings |
title_full | Towards Context-Aware Facial Emotion Reaction Database for Dyadic Interaction Settings |
title_fullStr | Towards Context-Aware Facial Emotion Reaction Database for Dyadic Interaction Settings |
title_full_unstemmed | Towards Context-Aware Facial Emotion Reaction Database for Dyadic Interaction Settings |
title_short | Towards Context-Aware Facial Emotion Reaction Database for Dyadic Interaction Settings |
title_sort | towards context-aware facial emotion reaction database for dyadic interaction settings |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9824663/ https://www.ncbi.nlm.nih.gov/pubmed/36617055 http://dx.doi.org/10.3390/s23010458 |
work_keys_str_mv | AT shamabdallahhussein towardscontextawarefacialemotionreactiondatabasefordyadicinteractionsettings AT khanamna towardscontextawarefacialemotionreactiondatabasefordyadicinteractionsettings AT lamasdavid towardscontextawarefacialemotionreactiondatabasefordyadicinteractionsettings AT tikkapia towardscontextawarefacialemotionreactiondatabasefordyadicinteractionsettings AT anbarjafarigholamreza towardscontextawarefacialemotionreactiondatabasefordyadicinteractionsettings |