
Real-time emotion detection by quantitative facial motion analysis


Bibliographic Details
Main Authors: Saadon, Jordan R., Yang, Fan, Burgert, Ryan, Mohammad, Selma, Gammel, Theresa, Sepe, Michael, Rafailovich, Miriam, Mikell, Charles B., Polak, Pawel, Mofakham, Sima
Format: Online Article Text
Language: English
Published: Public Library of Science 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10004542/
https://www.ncbi.nlm.nih.gov/pubmed/36897921
http://dx.doi.org/10.1371/journal.pone.0282730
_version_ 1784904859232763904
author Saadon, Jordan R.
Yang, Fan
Burgert, Ryan
Mohammad, Selma
Gammel, Theresa
Sepe, Michael
Rafailovich, Miriam
Mikell, Charles B.
Polak, Pawel
Mofakham, Sima
author_facet Saadon, Jordan R.
Yang, Fan
Burgert, Ryan
Mohammad, Selma
Gammel, Theresa
Sepe, Michael
Rafailovich, Miriam
Mikell, Charles B.
Polak, Pawel
Mofakham, Sima
author_sort Saadon, Jordan R.
collection PubMed
description BACKGROUND: Research into mood and emotion has often depended on slow and subjective self-report, highlighting a need for rapid, accurate, and objective assessment tools. METHODS: To address this gap, we developed a method using digital image speckle correlation (DISC), which tracks subtle changes in facial expressions invisible to the naked eye, to assess emotions in real-time. We presented ten participants with visual stimuli triggering neutral, happy, and sad emotions and quantified their associated facial responses via detailed DISC analysis. RESULTS: We identified key alterations in facial expression (facial maps) that reliably signal changes in mood state across all individuals based on these data. Furthermore, principal component analysis of these facial maps identified regions associated with happy and sad emotions. Compared with commercial deep learning solutions that use individual images to detect facial expressions and classify emotions, such as Amazon Rekognition, our DISC-based classifiers utilize frame-to-frame changes. Our data show that DISC-based classifiers deliver substantially better predictions, and they are inherently free of racial or gender bias. LIMITATIONS: Our sample size was limited, and participants were aware their faces were recorded on video. Despite this, our results remained consistent across individuals. CONCLUSIONS: We demonstrate that DISC-based facial analysis can be used to reliably identify an individual’s emotion and may provide a robust and economic modality for real-time, noninvasive clinical monitoring in the future.
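As a rough illustration of the pipeline the abstract describes (DISC-style frame-to-frame tracking of subtle facial motion, followed by principal component analysis of the resulting facial maps), a minimal Python sketch follows. The patch size, step, function names, and the use of phase_cross_correlation are illustrative assumptions, not the authors' implementation.

import numpy as np
from skimage.registration import phase_cross_correlation
from sklearn.decomposition import PCA

def displacement_map(frame_a, frame_b, patch=32, step=16):
    # Estimate a per-patch (dy, dx) displacement field between two
    # consecutive grayscale frames via sub-pixel cross-correlation.
    h, w = frame_a.shape
    vectors = []
    for y in range(0, h - patch, step):
        for x in range(0, w - patch, step):
            ref = frame_a[y:y + patch, x:x + patch]
            cur = frame_b[y:y + patch, x:x + patch]
            shift, _, _ = phase_cross_correlation(ref, cur, upsample_factor=10)
            vectors.append(shift)  # (dy, dx) for this patch
    return np.concatenate(vectors)  # flattened "facial map" of motion

# Hypothetical usage: one map per consecutive frame pair, then PCA to
# extract the motion components that separate emotional states.
# maps = np.stack([displacement_map(a, b) for a, b in frame_pairs])
# scores = PCA(n_components=2).fit_transform(maps)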
format Online
Article
Text
id pubmed-10004542
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-100045422023-03-11 Real-time emotion detection by quantitative facial motion analysis Saadon, Jordan R. Yang, Fan Burgert, Ryan Mohammad, Selma Gammel, Theresa Sepe, Michael Rafailovich, Miriam Mikell, Charles B. Polak, Pawel Mofakham, Sima PLoS One Research Article BACKGROUND: Research into mood and emotion has often depended on slow and subjective self-report, highlighting a need for rapid, accurate, and objective assessment tools. METHODS: To address this gap, we developed a method using digital image speckle correlation (DISC), which tracks subtle changes in facial expressions invisible to the naked eye, to assess emotions in real-time. We presented ten participants with visual stimuli triggering neutral, happy, and sad emotions and quantified their associated facial responses via detailed DISC analysis. RESULTS: We identified key alterations in facial expression (facial maps) that reliably signal changes in mood state across all individuals based on these data. Furthermore, principal component analysis of these facial maps identified regions associated with happy and sad emotions. Compared with commercial deep learning solutions that use individual images to detect facial expressions and classify emotions, such as Amazon Rekognition, our DISC-based classifiers utilize frame-to-frame changes. Our data show that DISC-based classifiers deliver substantially better predictions, and they are inherently free of racial or gender bias. LIMITATIONS: Our sample size was limited, and participants were aware their faces were recorded on video. Despite this, our results remained consistent across individuals. CONCLUSIONS: We demonstrate that DISC-based facial analysis can be used to reliably identify an individual’s emotion and may provide a robust and economic modality for real-time, noninvasive clinical monitoring in the future. Public Library of Science 2023-03-10 /pmc/articles/PMC10004542/ /pubmed/36897921 http://dx.doi.org/10.1371/journal.pone.0282730 Text en © 2023 Saadon et al https://creativecommons.org/licenses/by/4.0/This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/) , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Saadon, Jordan R.
Yang, Fan
Burgert, Ryan
Mohammad, Selma
Gammel, Theresa
Sepe, Michael
Rafailovich, Miriam
Mikell, Charles B.
Polak, Pawel
Mofakham, Sima
Real-time emotion detection by quantitative facial motion analysis
title Real-time emotion detection by quantitative facial motion analysis
title_full Real-time emotion detection by quantitative facial motion analysis
title_fullStr Real-time emotion detection by quantitative facial motion analysis
title_full_unstemmed Real-time emotion detection by quantitative facial motion analysis
title_short Real-time emotion detection by quantitative facial motion analysis
title_sort real-time emotion detection by quantitative facial motion analysis
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10004542/
https://www.ncbi.nlm.nih.gov/pubmed/36897921
http://dx.doi.org/10.1371/journal.pone.0282730
work_keys_str_mv AT saadonjordanr realtimeemotiondetectionbyquantitativefacialmotionanalysis
AT yangfan realtimeemotiondetectionbyquantitativefacialmotionanalysis
AT burgertryan realtimeemotiondetectionbyquantitativefacialmotionanalysis
AT mohammadselma realtimeemotiondetectionbyquantitativefacialmotionanalysis
AT gammeltheresa realtimeemotiondetectionbyquantitativefacialmotionanalysis
AT sepemichael realtimeemotiondetectionbyquantitativefacialmotionanalysis
AT rafailovichmiriam realtimeemotiondetectionbyquantitativefacialmotionanalysis
AT mikellcharlesb realtimeemotiondetectionbyquantitativefacialmotionanalysis
AT polakpawel realtimeemotiondetectionbyquantitativefacialmotionanalysis
AT mofakhamsima realtimeemotiondetectionbyquantitativefacialmotionanalysis