Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events
Main Authors: | Pugach, Ganna; Pitti, Alexandre; Tolochko, Olga; Gaussier, Philippe |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2019 |
Subjects: | Robotics and AI |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6416207/ https://www.ncbi.nlm.nih.gov/pubmed/30899217 http://dx.doi.org/10.3389/fnbot.2019.00005 |
_version_ | 1783403306572316672 |
---|---|
author | Pugach, Ganna Pitti, Alexandre Tolochko, Olga Gaussier, Philippe |
author_facet | Pugach, Ganna Pitti, Alexandre Tolochko, Olga Gaussier, Philippe |
author_sort | Pugach, Ganna |
collection | PubMed |
description | Representing objects in space is difficult because sensorimotor events are anchored in different reference frames, which can be either eye-, arm-, or target-centered. In the brain, Gain-Field (GF) neurons in the parietal cortex are involved in computing the spatial transformations needed to align tactile, visual, and proprioceptive signals. In reaching tasks, these GF neurons exploit a mechanism based on multiplicative interaction to bind simultaneously touched events from the hand with visual and proprioceptive information. By doing so, they can infer new reference frames that dynamically represent the location of the body parts in visual space (i.e., the body schema) and of nearby targets (i.e., the peripersonal space). Along this line, we propose a neural model based on GF neurons that integrates tactile events with arm postures and visual locations to construct hand- and target-centered receptive fields in visual space. In robotic experiments using an artificial skin, we show how our neural architecture reproduces the behaviors of parietal neurons (1) by dynamically encoding the body schema of our robotic arm without any visual tags on it and (2) by estimating the relative orientation and distance of targets to it. We demonstrate how tactile information facilitates the integration of visual and proprioceptive signals in order to construct the body space. |
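The multiplicative "gain-field" interaction described in the abstract lends itself to a compact illustration. Below is a minimal sketch, not the paper's implementation: all names, tuning widths, and the linear readout are illustrative assumptions. It shows, in a 1-D toy setup, how a population whose units multiply an eye-centered visual tuning curve by a posture-dependent gain implicitly encodes a target in a shifted, hand-centered frame:

```python
import numpy as np

# Gain-field (GF) sketch: each unit's response is the *product* of a
# visual receptive field (eye-centered) and a postural gain
# (proprioceptive). The resulting population basis makes the target's
# position in a shifted (e.g., hand-centered) frame linearly decodable.
# All parameters below are illustrative, not taken from the paper.

def gaussian(x, mu, sigma=0.15):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Preferred retinal positions and preferred arm postures of the population.
vis_prefs = np.linspace(-1.0, 1.0, 21)   # eye-centered visual tuning
post_prefs = np.linspace(-1.0, 1.0, 21)  # proprioceptive (posture) tuning

def gf_population(retinal_target, arm_posture):
    """Multiplicative interaction: visual tuning x postural gain."""
    vis = gaussian(retinal_target, vis_prefs)   # shape (21,)
    gain = gaussian(arm_posture, post_prefs)    # shape (21,)
    return np.outer(gain, vis)                  # shape (21, 21) GF basis

def decode_hand_centered(activity):
    """Linear readout: each unit votes for (visual pref - postural pref),
    i.e., the target position relative to the arm."""
    votes = vis_prefs[None, :] - post_prefs[:, None]
    return np.sum(activity * votes) / np.sum(activity)

# A target at retinal position 0.4 while the arm is at posture 0.1
# should read out as ~0.3 in the hand-centered frame.
act = gf_population(retinal_target=0.4, arm_posture=0.1)
print(round(decode_hand_centered(act), 2))  # -> 0.3
```

Because each unit's activity factorizes into a visual term and a postural term, the product basis is what makes the frame shift linearly decodable: a readout that weights each unit by the difference between its visual and postural preferences recovers the target position relative to the arm.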
format | Online Article Text |
id | pubmed-6416207 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-6416207 2019-03-21 Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events Pugach, Ganna Pitti, Alexandre Tolochko, Olga Gaussier, Philippe Front Neurorobot Robotics and AI Representing objects in space is difficult because sensorimotor events are anchored in different reference frames, which can be either eye-, arm-, or target-centered. In the brain, Gain-Field (GF) neurons in the parietal cortex are involved in computing the spatial transformations needed to align tactile, visual, and proprioceptive signals. In reaching tasks, these GF neurons exploit a mechanism based on multiplicative interaction to bind simultaneously touched events from the hand with visual and proprioceptive information. By doing so, they can infer new reference frames that dynamically represent the location of the body parts in visual space (i.e., the body schema) and of nearby targets (i.e., the peripersonal space). Along this line, we propose a neural model based on GF neurons that integrates tactile events with arm postures and visual locations to construct hand- and target-centered receptive fields in visual space. In robotic experiments using an artificial skin, we show how our neural architecture reproduces the behaviors of parietal neurons (1) by dynamically encoding the body schema of our robotic arm without any visual tags on it and (2) by estimating the relative orientation and distance of targets to it. We demonstrate how tactile information facilitates the integration of visual and proprioceptive signals in order to construct the body space. Frontiers Media S.A. 2019-03-07 /pmc/articles/PMC6416207/ /pubmed/30899217 http://dx.doi.org/10.3389/fnbot.2019.00005 Text en Copyright © 2019 Pugach, Pitti, Tolochko and Gaussier. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Robotics and AI Pugach, Ganna Pitti, Alexandre Tolochko, Olga Gaussier, Philippe Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events |
title | Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events |
title_full | Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events |
title_fullStr | Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events |
title_full_unstemmed | Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events |
title_short | Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events |
title_sort | brain-inspired coding of robot body schema through visuo-motor integration of touched events |
topic | Robotics and AI |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6416207/ https://www.ncbi.nlm.nih.gov/pubmed/30899217 http://dx.doi.org/10.3389/fnbot.2019.00005 |
work_keys_str_mv | AT pugachganna braininspiredcodingofrobotbodyschemathroughvisuomotorintegrationoftouchedevents AT pittialexandre braininspiredcodingofrobotbodyschemathroughvisuomotorintegrationoftouchedevents AT tolochkoolga braininspiredcodingofrobotbodyschemathroughvisuomotorintegrationoftouchedevents AT gaussierphilippe braininspiredcodingofrobotbodyschemathroughvisuomotorintegrationoftouchedevents |