
Virtual reality facial emotion recognition in social environments: An eye-tracking study

BACKGROUND: Virtual reality (VR) enables the administration of realistic and dynamic stimuli within a social context for the assessment and training of emotion recognition. We tested a novel VR emotion recognition task by comparing emotion recognition across a VR, video and photo task, investigating covariates of recognition and exploring visual attention in VR.

METHODS: Healthy individuals (n = 100) completed three emotion recognition tasks: a photo, video and VR task. During the VR task, emotions of virtual characters (avatars) in a VR street environment were rated, and eye-tracking was recorded in VR.

RESULTS: Recognition accuracy in VR (overall 75%) was comparable to the photo and video tasks. However, there were some differences: disgust and happiness had lower accuracy rates in VR, and better accuracy was achieved for surprise and anger in VR compared to the video task. Participants spent more time identifying disgust, fear and sadness than surprise and happiness. In general, attention was directed longer to the eye and nose areas than to the mouth.

DISCUSSION: Immersive VR tasks can be used for training and assessment of emotion recognition. VR enables easily controllable avatars within environments relevant for daily life. Validated emotional expressions and tasks will be of relevance for clinical applications.
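The methods and results above rest on two descriptive computations: per-emotion recognition accuracy within each task (photo, video, VR) and the share of gaze time falling on facial areas of interest (eyes, nose, mouth) recorded by the eye tracker. The sketch below is not the authors' analysis code; it only illustrates, on made-up trial records with a hypothetical column layout, how such trial-level data could be aggregated.

```python
# Minimal sketch (not the study's analysis pipeline): aggregating hypothetical
# trial-level records into per-emotion accuracy and AOI dwell-time proportions.
from collections import defaultdict

# Hypothetical trial records: (task, target emotion, response emotion,
# seconds of gaze on the eyes / nose / mouth areas of interest).
trials = [
    ("VR",    "anger",     "anger",     1.9, 1.1, 0.4),
    ("VR",    "disgust",   "anger",     2.2, 1.3, 0.6),
    ("VR",    "happiness", "happiness", 1.0, 0.7, 0.9),
    ("video", "anger",     "fear",      1.5, 0.9, 0.5),
    ("photo", "surprise",  "surprise",  1.2, 0.8, 0.3),
]

def accuracy_by_task_and_emotion(records):
    """Proportion of correctly labelled trials per (task, emotion) cell."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for task, target, response, *_ in records:
        total[(task, target)] += 1
        correct[(task, target)] += int(response == target)
    return {key: correct[key] / total[key] for key in total}

def aoi_dwell_proportions(records):
    """Share of total face-directed gaze time spent on eyes, nose and mouth."""
    sums = {"eyes": 0.0, "nose": 0.0, "mouth": 0.0}
    for _, _, _, eyes, nose, mouth in records:
        sums["eyes"] += eyes
        sums["nose"] += nose
        sums["mouth"] += mouth
    grand_total = sum(sums.values())
    return {aoi: t / grand_total for aoi, t in sums.items()}

if __name__ == "__main__":
    for (task, emotion), acc in sorted(accuracy_by_task_and_emotion(trials).items()):
        print(f"{task:5s} {emotion:10s} accuracy = {acc:.2f}")
    for aoi, share in aoi_dwell_proportions(trials).items():
        print(f"dwell share on {aoi}: {share:.2f}")
```

The study itself goes further (for example, comparing accuracy across tasks and examining covariates of recognition); the sketch stops at the descriptive aggregation step.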

Bibliographic Details
Main Authors: Geraets, C.N.W., Klein Tuente, S., Lestestuiver, B.P., van Beilen, M., Nijman, S.A., Marsman, J.B.C., Veling, W.
Format: Online Article, Text
Language: English
Published: Elsevier, 2021
Subjects: Full length Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8350588/
https://www.ncbi.nlm.nih.gov/pubmed/34401391
http://dx.doi.org/10.1016/j.invent.2021.100432
collection PubMed
id pubmed-8350588
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal Internet Interv
published online 2021-07-17
license © 2021 The Authors. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).