
Recognition of Empathy from Synchronization between Brain Activity and Eye Movement

In the era of user-generated content (UGC) and virtual interactions within the metaverse, empathic digital content has become increasingly important. This study aimed to quantify human empathy levels when exposed to digital media. To assess empathy, we analyzed brain wave activity and eye movements in response to emotional videos. Forty-seven participants watched eight emotional videos, and we collected their brain activity and eye movement data during the viewing. After each video session, participants provided subjective evaluations. Our analysis focused on the relationship between brain activity and eye movement in recognizing empathy. The findings revealed the following: (1) Participants were more inclined to empathize with videos depicting pleasant-arousal and unpleasant-relaxed emotions. (2) Saccades and fixation, key components of eye movement, occurred simultaneously with specific channels in the prefrontal and temporal lobes. (3) Eigenvalues of brain activity and pupil changes showed synchronization between the right pupil and certain channels in the prefrontal, parietal, and temporal lobes during empathic responses. These results suggest that eye movement characteristics can serve as an indicator of the cognitive empathic process when engaging with digital content. Furthermore, the observed changes in pupil size result from a combination of emotional and cognitive empathy elicited by the videos.
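
The record does not describe the authors' analysis pipeline beyond the abstract, so the following is only a minimal Python sketch of what quantifying "synchronization" between brain activity and pupil change can look like in practice: it correlates the alpha-band power envelope of a single EEG channel with a pupil-diameter trace. The sampling rates, the 8-13 Hz band, and the function names (alpha_band_power, sync_score) are illustrative assumptions, not details taken from the paper.

# Illustrative sketch only (not the authors' method): score "synchronization"
# between one EEG channel and pupil size as the peak cross-correlation of the
# EEG alpha-power envelope and the pupil-diameter time series.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_band_power(eeg, fs, low=8.0, high=13.0):
    """Instantaneous alpha-band (8-13 Hz) power envelope of one EEG channel."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)
    return np.abs(hilbert(filtered)) ** 2  # power of the analytic signal

def sync_score(eeg, eeg_fs, pupil, pupil_fs):
    """Peak absolute cross-correlation between alpha power and pupil diameter."""
    power = alpha_band_power(eeg, eeg_fs)
    # Resample the EEG power envelope onto the (slower) pupil time base.
    t_pupil = np.arange(len(pupil)) / pupil_fs
    t_eeg = np.arange(len(power)) / eeg_fs
    power_rs = np.interp(t_pupil, t_eeg, power)
    # z-score both series, then take the peak normalized cross-correlation.
    a = (power_rs - power_rs.mean()) / power_rs.std()
    b = (pupil - pupil.mean()) / pupil.std()
    xcorr = np.correlate(a, b, mode="full") / len(a)
    return np.max(np.abs(xcorr))

# Example with synthetic data: 60 s of EEG at 250 Hz and pupil diameter at 60 Hz.
rng = np.random.default_rng(0)
eeg = rng.standard_normal(60 * 250)
pupil = rng.standard_normal(60 * 60)
print(f"synchronization score: {sync_score(eeg, 250, pupil, 60):.3f}")

In a study like the one described, such a per-channel score could be compared between empathic and non-empathic viewing segments; the features the abstract actually names (eigenvalues of brain activity, saccade and fixation measures) are richer than this single correlation.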

Bibliographic Details
Main Authors: Zhang, Jing; Park, Sung; Cho, Ayoung; Whang, Mincheol
Format: Online Article Text
Language: English
Published: MDPI, 2023
Subjects: Article
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10255460/
https://www.ncbi.nlm.nih.gov/pubmed/37299888
http://dx.doi.org/10.3390/s23115162
Journal: Sensors (Basel)
Collection: PubMed
Record ID: pubmed-10255460
Record Format: MEDLINE/PubMed
Institution: National Center for Biotechnology Information
Published Online: 2023-05-29
License: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).