
Toward understanding social cues and signals in human–robot interaction: effects of robot gaze and proxemic behavior

As robots are increasingly deployed in settings requiring social interaction, research is needed to examine the social signals perceived by humans when robots display certain social cues. In this paper, we report a study designed to examine how humans interpret social cues exhibited by robots. We first provide a brief overview of perspectives from social cognition in humans and how these processes are applicable to human–robot interaction (HRI). We then discuss the need to examine the relationship between social cues and signals as a function of the degree to which a robot is perceived as a socially present agent. We describe an experiment in which social cues were manipulated on an iRobot Ava(TM) mobile robotics platform in a hallway navigation scenario. Cues associated with the robot’s proxemic behavior were found to significantly affect participant perceptions of the robot’s social presence and emotional state while cues associated with the robot’s gaze behavior were not found to be significant. Further, regardless of the proxemic behavior, participants attributed more social presence and emotional states to the robot over repeated interactions than when they first interacted with it. Generally, these results indicate the importance for HRI research to consider how social cues expressed by a robot can differentially affect perceptions of the robot’s mental states and intentions. The discussion focuses on implications for the design of robotic systems and future directions for research on the relationship between social cues and signals.


Bibliographic Details
Main Authors: Fiore, Stephen M., Wiltshire, Travis J., Lobato, Emilio J. C., Jentsch, Florian G., Huang, Wesley H., Axelrod, Benjamin
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2013
Subjects: Psychology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3842160/
https://www.ncbi.nlm.nih.gov/pubmed/24348434
http://dx.doi.org/10.3389/fpsyg.2013.00859
author Fiore, Stephen M.
Wiltshire, Travis J.
Lobato, Emilio J. C.
Jentsch, Florian G.
Huang, Wesley H.
Axelrod, Benjamin
collection PubMed
description As robots are increasingly deployed in settings requiring social interaction, research is needed to examine the social signals perceived by humans when robots display certain social cues. In this paper, we report a study designed to examine how humans interpret social cues exhibited by robots. We first provide a brief overview of perspectives from social cognition in humans and how these processes are applicable to human–robot interaction (HRI). We then discuss the need to examine the relationship between social cues and signals as a function of the degree to which a robot is perceived as a socially present agent. We describe an experiment in which social cues were manipulated on an iRobot Ava(TM) mobile robotics platform in a hallway navigation scenario. Cues associated with the robot’s proxemic behavior were found to significantly affect participant perceptions of the robot’s social presence and emotional state while cues associated with the robot’s gaze behavior were not found to be significant. Further, regardless of the proxemic behavior, participants attributed more social presence and emotional states to the robot over repeated interactions than when they first interacted with it. Generally, these results indicate the importance for HRI research to consider how social cues expressed by a robot can differentially affect perceptions of the robot’s mental states and intentions. The discussion focuses on implications for the design of robotic systems and future directions for research on the relationship between social cues and signals.
format Online
Article
Text
id pubmed-3842160
institution National Center for Biotechnology Information
language English
publishDate 2013
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-3842160 2013-12-13
Toward understanding social cues and signals in human–robot interaction: effects of robot gaze and proxemic behavior
Fiore, Stephen M.
Wiltshire, Travis J.
Lobato, Emilio J. C.
Jentsch, Florian G.
Huang, Wesley H.
Axelrod, Benjamin
Front Psychol
Psychology
As robots are increasingly deployed in settings requiring social interaction, research is needed to examine the social signals perceived by humans when robots display certain social cues. In this paper, we report a study designed to examine how humans interpret social cues exhibited by robots. We first provide a brief overview of perspectives from social cognition in humans and how these processes are applicable to human–robot interaction (HRI). We then discuss the need to examine the relationship between social cues and signals as a function of the degree to which a robot is perceived as a socially present agent. We describe an experiment in which social cues were manipulated on an iRobot Ava(TM) mobile robotics platform in a hallway navigation scenario. Cues associated with the robot’s proxemic behavior were found to significantly affect participant perceptions of the robot’s social presence and emotional state while cues associated with the robot’s gaze behavior were not found to be significant. Further, regardless of the proxemic behavior, participants attributed more social presence and emotional states to the robot over repeated interactions than when they first interacted with it. Generally, these results indicate the importance for HRI research to consider how social cues expressed by a robot can differentially affect perceptions of the robot’s mental states and intentions. The discussion focuses on implications for the design of robotic systems and future directions for research on the relationship between social cues and signals.
Frontiers Media S.A. 2013-11-27
/pmc/articles/PMC3842160/
/pubmed/24348434
http://dx.doi.org/10.3389/fpsyg.2013.00859
Text en
Copyright © 2013 Fiore, Wiltshire, Lobato, Jentsch, Huang and Axelrod. http://creativecommons.org/licenses/by/3.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
Toward understanding social cues and signals in human–robot interaction: effects of robot gaze and proxemic behavior
title Toward understanding social cues and signals in human–robot interaction: effects of robot gaze and proxemic behavior
topic Psychology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3842160/
https://www.ncbi.nlm.nih.gov/pubmed/24348434
http://dx.doi.org/10.3389/fpsyg.2013.00859