
Audio-visual feedback improves the BCI performance in the navigational control of a humanoid robot

Advances in brain–computer interface (BCI) technology allow people to actively interact with the world through surrogates. Controlling real humanoid robots via BCI as intuitively as we control our own bodies remains a challenge for current research in robotics and neuroscience. To interact successfully with the environment, the brain integrates multiple sensory cues into a coherent representation of the world. Cognitive neuroscience studies show that multisensory integration can yield a gain over any single modality and ultimately improve overall sensorimotor performance. For example, reactivity to simultaneous visual and auditory stimuli may be higher than the sum of reactivities to the same stimuli delivered in isolation or in temporal sequence. Yet little is known about whether audio-visual integration can improve control of a surrogate. To explore this issue, we provided human footstep sounds as auditory feedback to BCI users while they controlled a humanoid robot. Participants were asked to steer their robot surrogate and perform a pick-and-place task through BCI-SSVEPs. We found that audio-visual synchrony between the footstep sounds and the humanoid's actual walk reduced the time required to steer the robot. Thus, auditory feedback congruent with the humanoid's actions may improve the BCI user's motor decisions and support the feeling of control over the robot. Our results shed light on the possibility of improving robot control by combining multisensory feedback for the BCI user.


Bibliographic Details
Main Authors: Tidoni, Emmanuele, Gergondet, Pierre, Kheddar, Abderrahmane, Aglioti, Salvatore M.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2014
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4060053/
https://www.ncbi.nlm.nih.gov/pubmed/24987350
http://dx.doi.org/10.3389/fnbot.2014.00020
author Tidoni, Emmanuele
Gergondet, Pierre
Kheddar, Abderrahmane
Aglioti, Salvatore M.
collection PubMed
description Advances in brain–computer interface (BCI) technology allow people to actively interact with the world through surrogates. Controlling real humanoid robots via BCI as intuitively as we control our own bodies remains a challenge for current research in robotics and neuroscience. To interact successfully with the environment, the brain integrates multiple sensory cues into a coherent representation of the world. Cognitive neuroscience studies show that multisensory integration can yield a gain over any single modality and ultimately improve overall sensorimotor performance. For example, reactivity to simultaneous visual and auditory stimuli may be higher than the sum of reactivities to the same stimuli delivered in isolation or in temporal sequence. Yet little is known about whether audio-visual integration can improve control of a surrogate. To explore this issue, we provided human footstep sounds as auditory feedback to BCI users while they controlled a humanoid robot. Participants were asked to steer their robot surrogate and perform a pick-and-place task through BCI-SSVEPs. We found that audio-visual synchrony between the footstep sounds and the humanoid's actual walk reduced the time required to steer the robot. Thus, auditory feedback congruent with the humanoid's actions may improve the BCI user's motor decisions and support the feeling of control over the robot. Our results shed light on the possibility of improving robot control by combining multisensory feedback for the BCI user.
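The abstract states that commands were issued through BCI-SSVEPs (steady-state visually evoked potentials). As a rough illustration of the general idea only (not the study's actual pipeline), an SSVEP decoder can select the on-screen command whose flicker frequency carries the most power in the user's EEG. Everything below is a hypothetical sketch: the single-channel setup, the sampling rate, the flicker frequencies, and the command names are assumptions, not details from the paper.

```python
import numpy as np

def decode_ssvep_command(eeg, fs, flicker_freqs):
    """Return the index of the flicker frequency whose narrow spectral
    band holds the most power in a single-channel EEG segment."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)     # bin frequencies (Hz)
    powers = []
    for f in flicker_freqs:
        # sum power in a +/- 0.5 Hz band around each candidate frequency
        band = (freqs >= f - 0.5) & (freqs <= f + 0.5)
        powers.append(spectrum[band].sum())
    return int(np.argmax(powers))

# Simulated 2-second EEG segment dominated by a 13 Hz SSVEP response
fs = 256
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 13 * t) + 0.3 * rng.standard_normal(t.size)

commands = ["forward", "left", "right"]   # hypothetical command mapping
freqs = [9.0, 13.0, 17.0]                 # hypothetical flicker frequencies
print(commands[decode_ssvep_command(eeg, fs, freqs)])  # prints "left"
```

Real SSVEP systems typically use multichannel occipital EEG, harmonic-aware detectors such as canonical correlation analysis, and a confidence threshold before issuing a command; the band-power comparison above is only the simplest form of the technique.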
format Online
Article
Text
id pubmed-4060053
institution National Center for Biotechnology Information
language English
publishDate 2014
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-4060053 2014-07-01 Audio-visual feedback improves the BCI performance in the navigational control of a humanoid robot Tidoni, Emmanuele Gergondet, Pierre Kheddar, Abderrahmane Aglioti, Salvatore M. Front Neurorobot Neuroscience Frontiers Media S.A. 2014-06-17 /pmc/articles/PMC4060053/ /pubmed/24987350 http://dx.doi.org/10.3389/fnbot.2014.00020 Text en Copyright © 2014 Tidoni, Gergondet, Kheddar and Aglioti.
http://creativecommons.org/licenses/by/3.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title Audio-visual feedback improves the BCI performance in the navigational control of a humanoid robot
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4060053/
https://www.ncbi.nlm.nih.gov/pubmed/24987350
http://dx.doi.org/10.3389/fnbot.2014.00020