A Framework for Sensorimotor Cross-Perception and Cross-Behavior Knowledge Transfer for Object Categorization

Bibliographic Details
Main Authors: Tatiya, Gyan; Hosseini, Ramtin; Hughes, Michael C.; Sinapov, Jivko
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7805839/
https://www.ncbi.nlm.nih.gov/pubmed/33501303
http://dx.doi.org/10.3389/frobt.2020.522141
_version_ 1783636392427913216
author Tatiya, Gyan
Hosseini, Ramtin
Hughes, Michael C.
Sinapov, Jivko
author_facet Tatiya, Gyan
Hosseini, Ramtin
Hughes, Michael C.
Sinapov, Jivko
author_sort Tatiya, Gyan
collection PubMed
description From an early age, humans learn to develop an intuition for the physical nature of the objects around them by using exploratory behaviors. Such exploration provides observations of how objects feel, sound, look, and move as a result of actions applied to them. Previous work in robotics has shown that robots can also use such behaviors (e.g., lifting, pressing, shaking) to infer object properties that camera input alone cannot detect. Such learned representations are specific to each individual robot and cannot currently be transferred directly to another robot with different sensors and actions. Moreover, sensor failure can cause a robot to lose a specific sensory modality, which may prevent it from using perceptual models that require it as input. To address these limitations, we propose a framework for knowledge transfer across behaviors and sensory modalities such that: (1) knowledge can be transferred from one or more robots to another, and (2) knowledge can be transferred from one or more sensory modalities to another. We propose two different models for transfer based on variational auto-encoders and encoder-decoder networks. The main hypothesis behind our approach is that if two or more robots share multi-sensory object observations of a shared set of objects, then those observations can be used to establish mappings between multiple feature spaces, each corresponding to a combination of an exploratory behavior and a sensory modality. We evaluate our approach on a category recognition task using a dataset in which a robot used 9 behaviors, coupled with 4 sensory modalities, performed multiple times on 100 objects. The results indicate that sensorimotor knowledge about objects can be transferred both across behaviors and across sensory modalities, such that a new robot (or the same robot, but with a different set of sensors) can bootstrap its category recognition models without having to exhaustively explore the full set of objects.
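Note: the transfer idea in the abstract can be illustrated with a minimal sketch. Assuming paired observations of the same shared objects in two feature spaces, the paper's encoder-decoder transfer model reduces to learning a regression from source features (one behavior-modality pair) to target features (another). All names, dimensions, and the random stand-in data below are hypothetical illustrations, not the authors' released code.

import torch
import torch.nn as nn

# Hypothetical dimensions: source = (behavior A, modality X), target = (behavior B, modality Y).
SRC_DIM, TGT_DIM, CODE_DIM = 128, 64, 32

# Encoder-decoder mapping: compress source features to a shared code, decode into the target space.
model = nn.Sequential(
    nn.Linear(SRC_DIM, CODE_DIM), nn.ReLU(),  # encoder
    nn.Linear(CODE_DIM, TGT_DIM),             # decoder
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Paired features from the shared set of explored objects (random stand-ins here).
src = torch.randn(500, SRC_DIM)
tgt = torch.randn(500, TGT_DIM)

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(src), tgt)  # reconstruction error measured in the target feature space
    loss.backward()
    optimizer.step()

# Once trained, features of novel objects observed only in the source space can be projected
# into the target space and fed to the target robot's category recognition classifier.

The variational auto-encoder variant mentioned in the abstract would replace the deterministic code with a sampled latent variable; the sketch above shows only the simpler of the two model families.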
format Online
Article
Text
id pubmed-7805839
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-7805839 2021-01-25 A Framework for Sensorimotor Cross-Perception and Cross-Behavior Knowledge Transfer for Object Categorization Tatiya, Gyan Hosseini, Ramtin Hughes, Michael C. Sinapov, Jivko Front Robot AI Robotics and AI From an early age, humans learn to develop an intuition for the physical nature of the objects around them by using exploratory behaviors. Such exploration provides observations of how objects feel, sound, look, and move as a result of actions applied to them. Previous work in robotics has shown that robots can also use such behaviors (e.g., lifting, pressing, shaking) to infer object properties that camera input alone cannot detect. Such learned representations are specific to each individual robot and cannot currently be transferred directly to another robot with different sensors and actions. Moreover, sensor failure can cause a robot to lose a specific sensory modality, which may prevent it from using perceptual models that require it as input. To address these limitations, we propose a framework for knowledge transfer across behaviors and sensory modalities such that: (1) knowledge can be transferred from one or more robots to another, and (2) knowledge can be transferred from one or more sensory modalities to another. We propose two different models for transfer based on variational auto-encoders and encoder-decoder networks. The main hypothesis behind our approach is that if two or more robots share multi-sensory object observations of a shared set of objects, then those observations can be used to establish mappings between multiple feature spaces, each corresponding to a combination of an exploratory behavior and a sensory modality. We evaluate our approach on a category recognition task using a dataset in which a robot used 9 behaviors, coupled with 4 sensory modalities, performed multiple times on 100 objects. The results indicate that sensorimotor knowledge about objects can be transferred both across behaviors and across sensory modalities, such that a new robot (or the same robot, but with a different set of sensors) can bootstrap its category recognition models without having to exhaustively explore the full set of objects. Frontiers Media S.A. 2020-10-09 /pmc/articles/PMC7805839/ /pubmed/33501303 http://dx.doi.org/10.3389/frobt.2020.522141 Text en Copyright © 2020 Tatiya, Hosseini, Hughes and Sinapov. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Robotics and AI
Tatiya, Gyan
Hosseini, Ramtin
Hughes, Michael C.
Sinapov, Jivko
A Framework for Sensorimotor Cross-Perception and Cross-Behavior Knowledge Transfer for Object Categorization
title A Framework for Sensorimotor Cross-Perception and Cross-Behavior Knowledge Transfer for Object Categorization
title_full A Framework for Sensorimotor Cross-Perception and Cross-Behavior Knowledge Transfer for Object Categorization
title_fullStr A Framework for Sensorimotor Cross-Perception and Cross-Behavior Knowledge Transfer for Object Categorization
title_full_unstemmed A Framework for Sensorimotor Cross-Perception and Cross-Behavior Knowledge Transfer for Object Categorization
title_short A Framework for Sensorimotor Cross-Perception and Cross-Behavior Knowledge Transfer for Object Categorization
title_sort framework for sensorimotor cross-perception and cross-behavior knowledge transfer for object categorization
topic Robotics and AI
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7805839/
https://www.ncbi.nlm.nih.gov/pubmed/33501303
http://dx.doi.org/10.3389/frobt.2020.522141
work_keys_str_mv AT tatiyagyan aframeworkforsensorimotorcrossperceptionandcrossbehaviorknowledgetransferforobjectcategorization
AT hosseiniramtin aframeworkforsensorimotorcrossperceptionandcrossbehaviorknowledgetransferforobjectcategorization
AT hughesmichaelc aframeworkforsensorimotorcrossperceptionandcrossbehaviorknowledgetransferforobjectcategorization
AT sinapovjivko aframeworkforsensorimotorcrossperceptionandcrossbehaviorknowledgetransferforobjectcategorization
AT tatiyagyan frameworkforsensorimotorcrossperceptionandcrossbehaviorknowledgetransferforobjectcategorization
AT hosseiniramtin frameworkforsensorimotorcrossperceptionandcrossbehaviorknowledgetransferforobjectcategorization
AT hughesmichaelc frameworkforsensorimotorcrossperceptionandcrossbehaviorknowledgetransferforobjectcategorization
AT sinapovjivko frameworkforsensorimotorcrossperceptionandcrossbehaviorknowledgetransferforobjectcategorization