Integrating Verbal and Nonverbal Communication in a Dynamic Neural Field Architecture for Human–Robot Interaction
How do humans coordinate their intentions, goals and motor behaviors when performing joint action tasks? Recent experimental evidence suggests that resonance processes in the observer's motor system are crucially involved in our ability to understand the actions of others, to infer their goals and...
Main Authors: Bicho, Estela; Louro, Luís; Erlhagen, Wolfram
Format: Text
Language: English
Published: Frontiers Research Foundation, 2010
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2901089/ https://www.ncbi.nlm.nih.gov/pubmed/20725504 http://dx.doi.org/10.3389/fnbot.2010.00005
_version_ | 1782183649777549312 |
author | Bicho, Estela; Louro, Luís; Erlhagen, Wolfram
author_facet | Bicho, Estela; Louro, Luís; Erlhagen, Wolfram
author_sort | Bicho, Estela |
collection | PubMed |
description | How do humans coordinate their intentions, goals and motor behaviors when performing joint action tasks? Recent experimental evidence suggests that resonance processes in the observer's motor system are crucially involved in our ability to understand the actions of others, to infer their goals and even to comprehend their action-related language. In this paper, we present a control architecture for human–robot collaboration that exploits this close perception-action linkage as a means to achieve more natural and efficient communication grounded in sensorimotor experiences. The architecture is formalized by a coupled system of dynamic neural fields representing a distributed network of neural populations that encode in their activation patterns goals, actions and shared task knowledge. We validate the verbal and nonverbal communication skills of the robot in a joint assembly task in which the human–robot team has to construct toy objects from their components. The experiments focus on the robot's capacity to anticipate the user's needs and to detect and communicate unexpected events that may occur during joint task execution. |
format | Text |
id | pubmed-2901089 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2010 |
publisher | Frontiers Research Foundation |
record_format | MEDLINE/PubMed |
spelling | pubmed-2901089 2010-08-19 Integrating Verbal and Nonverbal Communication in a Dynamic Neural Field Architecture for Human–Robot Interaction Bicho, Estela; Louro, Luís; Erlhagen, Wolfram Front Neurorobotics Neuroscience How do humans coordinate their intentions, goals and motor behaviors when performing joint action tasks? Recent experimental evidence suggests that resonance processes in the observer's motor system are crucially involved in our ability to understand the actions of others, to infer their goals and even to comprehend their action-related language. In this paper, we present a control architecture for human–robot collaboration that exploits this close perception-action linkage as a means to achieve more natural and efficient communication grounded in sensorimotor experiences. The architecture is formalized by a coupled system of dynamic neural fields representing a distributed network of neural populations that encode in their activation patterns goals, actions and shared task knowledge. We validate the verbal and nonverbal communication skills of the robot in a joint assembly task in which the human–robot team has to construct toy objects from their components. The experiments focus on the robot's capacity to anticipate the user's needs and to detect and communicate unexpected events that may occur during joint task execution. Frontiers Research Foundation 2010-05-21 /pmc/articles/PMC2901089/ /pubmed/20725504 http://dx.doi.org/10.3389/fnbot.2010.00005 Text en Copyright © 2010 Bicho, Louro and Erlhagen. http://www.frontiersin.org/licenseagreement This is an open-access article subject to an exclusive license agreement between the authors and the Frontiers Research Foundation, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited. |
spellingShingle | Neuroscience Bicho, Estela Louro, Luís Erlhagen, Wolfram Integrating Verbal and Nonverbal Communication in a Dynamic Neural Field Architecture for Human–Robot Interaction |
title | Integrating Verbal and Nonverbal Communication in a Dynamic Neural Field Architecture for Human–Robot Interaction |
title_full | Integrating Verbal and Nonverbal Communication in a Dynamic Neural Field Architecture for Human–Robot Interaction |
title_fullStr | Integrating Verbal and Nonverbal Communication in a Dynamic Neural Field Architecture for Human–Robot Interaction |
title_full_unstemmed | Integrating Verbal and Nonverbal Communication in a Dynamic Neural Field Architecture for Human–Robot Interaction |
title_short | Integrating Verbal and Nonverbal Communication in a Dynamic Neural Field Architecture for Human–Robot Interaction |
title_sort | integrating verbal and nonverbal communication in a dynamic neural field architecture for human–robot interaction |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2901089/ https://www.ncbi.nlm.nih.gov/pubmed/20725504 http://dx.doi.org/10.3389/fnbot.2010.00005 |
work_keys_str_mv | AT bichoestela integratingverbalandnonverbalcommunicationinadynamicneuralfieldarchitectureforhumanrobotinteraction AT louroluis integratingverbalandnonverbalcommunicationinadynamicneuralfieldarchitectureforhumanrobotinteraction AT erlhagenwolfram integratingverbalandnonverbalcommunicationinadynamicneuralfieldarchitectureforhumanrobotinteraction |
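The abstract above states that the control architecture is formalized as a coupled system of dynamic neural fields whose activation patterns encode goals, actions and shared task knowledge. As an illustration only, the sketch below simulates a single Amari-style dynamic neural field, the kind of building block such architectures couple into a network. It is not code from the paper; all parameter values (resting level, interaction kernel, stimulus) are assumptions chosen so that the field forms a self-sustained activation bump.

```python
# Minimal sketch of a single Amari-style dynamic neural field (DNF).
# Illustration only: parameter values are assumptions, not taken from the paper.
import numpy as np

def kernel(d, a_exc=2.0, s_exc=1.0, g_inh=0.5):
    """Lateral interaction: local excitation minus constant (global) inhibition."""
    return a_exc * np.exp(-d**2 / (2.0 * s_exc**2)) - g_inh

def simulate_field(T=30.0, dt=0.05, tau=1.0, h=-1.0, n=181, length=18.0):
    x = np.linspace(-length / 2.0, length / 2.0, n)
    dx = x[1] - x[0]
    u = np.full(n, h)                          # activation starts at resting level h
    w = kernel(x[:, None] - x[None, :])        # interaction matrix w(x - x')
    stim = 4.0 * np.exp(-(x - 2.0)**2 / 2.0)   # transient localized input (e.g. evidence for one goal)
    for step in range(int(T / dt)):
        f_u = (u > 0).astype(float)            # Heaviside output nonlinearity f(u)
        inp = stim if step * dt < 5.0 else 0.0 # input is removed after t = 5
        # Amari equation: tau * du/dt = -u + h + S(x,t) + integral w(x - x') f(u(x')) dx'
        u += (dt / tau) * (-u + h + inp + w @ f_u * dx)
    return x, u

if __name__ == "__main__":
    x, u = simulate_field()
    # After the input is switched off, a localized bump of positive activation remains:
    # the field holds on to the decision it converged to.
    print("peak activation %.2f at x = %.1f" % (u.max(), x[u.argmax()]))
```

The self-sustained bump that persists after the transient input is removed is the generic memory and decision mechanism of such fields; in a full architecture of the kind described above, a bump of this sort would stand for an inferred goal or a selected complementary action.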