In our own image? Emotional and neural processing differences when observing human–human vs human–robot interactions

Notwithstanding the significant role that human–robot interactions (HRI) will play in the near future, limited research has explored the neural correlates of feeling eerie in response to social robots. To address this empirical lacuna, the current investigation examined brain activity using functional magnetic resonance imaging while a group of participants (n = 26) viewed a series of human–human interactions (HHI) and HRI. Although brain sites constituting the mentalizing network were found to respond to both types of interactions, systematic neural variation across sites signaled diverging social-cognitive strategies during HHI and HRI processing. Specifically, HHI elicited increased activity in the left temporal–parietal junction indicative of situation-specific mental state attributions, whereas HRI recruited the precuneus and the ventromedial prefrontal cortex (VMPFC) suggestive of script-based social reasoning. Activity in the VMPFC also tracked feelings of eeriness towards HRI in a parametric manner, revealing a potential neural correlate for a phenomenon known as the uncanny valley. By demonstrating how understanding social interactions depends on the kind of agents involved, this study highlights pivotal sub-routes of impression formation and identifies prominent challenges in the use of humanoid robots.

Bibliographic Details
Main Authors: Wang, Yin; Quadflieg, Susanne
Format: Online Article Text
Language: English
Published: Oxford University Press, 2015
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4631149/
https://www.ncbi.nlm.nih.gov/pubmed/25911418
http://dx.doi.org/10.1093/scan/nsv043
Journal: Soc Cogn Affect Neurosci (Original Articles)
Published online: 2015-04-23; print issue: 2015-11
© The Author (2015). Published by Oxford University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.