Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction
Partners have to build a shared understanding of their environment in everyday collaborative tasks by aligning their perceptions and establishing a common ground. This is one of the aims of shared perception: revealing characteristics of the individual perception to others with whom we share the sam...
Main Authors: | Matarese, Marco; Rea, Francesco; Sciutti, Alessandra |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2022 |
Subjects: | Robotics and AI |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9240641/ https://www.ncbi.nlm.nih.gov/pubmed/35783020 http://dx.doi.org/10.3389/frobt.2022.733954 |
_version_ | 1784737611442552832 |
---|---|
author | Matarese, Marco Rea, Francesco Sciutti, Alessandra |
author_facet | Matarese, Marco Rea, Francesco Sciutti, Alessandra |
author_sort | Matarese, Marco |
collection | PubMed |
description | Partners have to build a shared understanding of their environment in everyday collaborative tasks by aligning their perceptions and establishing a common ground. This is one of the aims of shared perception: revealing characteristics of the individual perception to others with whom we share the same environment. In this regard, social cognitive processes, such as joint attention and perspective-taking, form a shared perception. From a Human-Robot Interaction (HRI) perspective, robots would benefit from the ability to establish shared perception with humans and a common understanding of the environment with their partners. In this work, we wanted to assess whether a robot, considering the differences in perception between itself and its partner, could be more effective in its helping role and to what extent this improves task completion and the interaction experience. For this purpose, we designed a mathematical model for a collaborative shared perception that aims to maximise the collaborators’ knowledge of the environment when there are asymmetries in perception. Moreover, we instantiated and tested our model via a real HRI scenario. The experiment consisted of a cooperative game in which participants had to build towers of Lego bricks, while the robot took the role of a suggester. In particular, we conducted experiments using two different robot behaviours. In one condition, based on shared perception, the robot gave suggestions by considering the partners’ point of view and using its inference about their common ground to select the most informative hint. In the other condition, the robot just indicated the brick that would have yielded a higher score from its individual perspective. The adoption of shared perception in the selection of suggestions led to better performances in all the instances of the game where the visual information was not a priori common to both agents. However, the subjective evaluation of the robot’s behaviour did not change between conditions. |
format | Online Article Text |
id | pubmed-9240641 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-92406412022-06-30 Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction Matarese, Marco Rea, Francesco Sciutti, Alessandra Front Robot AI Robotics and AI Partners have to build a shared understanding of their environment in everyday collaborative tasks by aligning their perceptions and establishing a common ground. This is one of the aims of shared perception: revealing characteristics of the individual perception to others with whom we share the same environment. In this regard, social cognitive processes, such as joint attention and perspective-taking, form a shared perception. From a Human-Robot Interaction (HRI) perspective, robots would benefit from the ability to establish shared perception with humans and a common understanding of the environment with their partners. In this work, we wanted to assess whether a robot, considering the differences in perception between itself and its partner, could be more effective in its helping role and to what extent this improves task completion and the interaction experience. For this purpose, we designed a mathematical model for a collaborative shared perception that aims to maximise the collaborators’ knowledge of the environment when there are asymmetries in perception. Moreover, we instantiated and tested our model via a real HRI scenario. The experiment consisted of a cooperative game in which participants had to build towers of Lego bricks, while the robot took the role of a suggester. In particular, we conducted experiments using two different robot behaviours. In one condition, based on shared perception, the robot gave suggestions by considering the partners’ point of view and using its inference about their common ground to select the most informative hint. In the other condition, the robot just indicated the brick that would have yielded a higher score from its individual perspective. The adoption of shared perception in the selection of suggestions led to better performances in all the instances of the game where the visual information was not a priori common to both agents. However, the subjective evaluation of the robot’s behaviour did not change between conditions. Frontiers Media S.A. 2022-06-15 /pmc/articles/PMC9240641/ /pubmed/35783020 http://dx.doi.org/10.3389/frobt.2022.733954 Text en Copyright © 2022 Matarese, Rea and Sciutti. https://creativecommons.org/licenses/by/4.0/This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Robotics and AI Matarese, Marco Rea, Francesco Sciutti, Alessandra Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction |
title | Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction |
title_full | Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction |
title_fullStr | Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction |
title_full_unstemmed | Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction |
title_short | Perception is Only Real When Shared: A Mathematical Model for Collaborative Shared Perception in Human-Robot Interaction |
title_sort | perception is only real when shared: a mathematical model for collaborative shared perception in human-robot interaction |
topic | Robotics and AI |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9240641/ https://www.ncbi.nlm.nih.gov/pubmed/35783020 http://dx.doi.org/10.3389/frobt.2022.733954 |
work_keys_str_mv | AT mataresemarco perceptionisonlyrealwhensharedamathematicalmodelforcollaborativesharedperceptioninhumanrobotinteraction AT reafrancesco perceptionisonlyrealwhensharedamathematicalmodelforcollaborativesharedperceptioninhumanrobotinteraction AT sciuttialessandra perceptionisonlyrealwhensharedamathematicalmodelforcollaborativesharedperceptioninhumanrobotinteraction |
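The abstract above contrasts two suggestion policies: a shared-perception condition in which the robot reasons over the partner's point of view and their common ground to pick the most informative hint, and a baseline in which it simply indicates the brick with the highest score from its own perspective. The record gives no implementation details, so the Python sketch below is purely illustrative: the `Brick` fields, the scores, the visibility flag and both `*_hint` functions are hypothetical stand-ins for the idea, not the authors' mathematical model.

```python
# Illustrative sketch only: the data structures and both policies are
# hypothetical, not the model described in the paper.
from dataclasses import dataclass


@dataclass
class Brick:
    name: str
    score: int              # points the brick would add to the tower (assumed)
    visible_to_human: bool   # whether the partner can already see it (assumed)


def egocentric_hint(bricks):
    """Baseline condition: point at the highest-scoring brick,
    ignoring what the partner can or cannot see."""
    return max(bricks, key=lambda b: b.score)


def shared_perception_hint(bricks):
    """Shared-perception condition (sketch): prefer bricks the partner
    cannot see, since hints about them add more to the common ground;
    break ties by score."""
    return max(bricks, key=lambda b: (not b.visible_to_human, b.score))


if __name__ == "__main__":
    bricks = [
        Brick("red", score=5, visible_to_human=True),
        Brick("blue", score=4, visible_to_human=False),   # hidden from the human
        Brick("green", score=2, visible_to_human=False),
    ]
    print("egocentric:", egocentric_hint(bricks).name)                 # -> red
    print("shared perception:", shared_perception_hint(bricks).name)   # -> blue
```

In the study itself, the shared-perception robot relies on an inference about the partner's knowledge of the environment rather than a simple visibility flag, so this ordering is only a caricature of the asymmetry the abstract describes.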