
Learning Actions From Natural Language Instructions Using an ON-World Embodied Cognitive Architecture


Bibliographic Details
Main Authors: Giorgi, Ioanna, Cangelosi, Angelo, Masala, Giovanni L.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8155541/
https://www.ncbi.nlm.nih.gov/pubmed/34054452
http://dx.doi.org/10.3389/fnbot.2021.626380
author Giorgi, Ioanna
Cangelosi, Angelo
Masala, Giovanni L.
collection PubMed
description Endowing robots with the ability to view the world the way humans do, to understand natural language, and to learn novel semantic meanings when they are deployed in the physical world is a compelling problem. Another significant aspect is linking language to action in artificial agents, in particular for utterances involving abstract words. In this work, we propose a novel methodology, using a brain-inspired architecture, to model an appropriate mapping of language onto the perceptual and internal motor representations of humanoid robots. This research presents the first robotic instantiation of a complex architecture based on Baddeley's Working Memory (WM) model. Our proposed method provides a scalable knowledge representation of verbal and non-verbal signals in the cognitive architecture, which supports incremental open-ended learning. Human spoken utterances about the workspace and the task are combined with the robot's internal knowledge map to accomplish task goals. We train the robot to understand instructions involving higher-order (abstract) linguistic concepts of developmental complexity, which cannot be directly grounded in the physical world and are not pre-defined in the robot's static self-representation. Our interactive learning method enables flexible run-time acquisition of novel linguistic forms and real-world information without retraining the cognitive model. Hence, the robot can adapt to new workspaces that include novel objects and task outcomes. We assess the potential of the proposed methodology in verification experiments with a humanoid robot. The results suggest that the model can robustly link language bi-directionally with the physical environment and solve a variety of manipulation tasks, starting with limited knowledge and gradually learning from run-time interaction with the tutor, beyond the pre-trained stage.
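The abstract describes run-time, open-ended acquisition of novel words without retraining the underlying cognitive model. The minimal Python sketch below illustrates only that general interaction pattern: an unknown word in an instruction triggers a tutor query, and the grounded referent is added to the robot's knowledge map at run time. All names here (`KnowledgeMap`, `execute`, `ask_tutor`) are hypothetical illustrations, not the authors' architecture.

    # Illustrative toy sketch of run-time vocabulary acquisition.
    # NOT the authors' implementation; all names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class KnowledgeMap:
        """Maps words to grounded referents (objects or motor primitives)."""
        lexicon: dict = field(default_factory=dict)

        def known(self, word: str) -> bool:
            return word in self.lexicon

        def learn(self, word: str, referent: str) -> None:
            # Novel words are added at run time; no retraining is implied.
            self.lexicon[word] = referent

    def execute(instruction: str, kmap: KnowledgeMap, ask_tutor) -> list:
        """Resolve each word of an instruction; query the tutor for unknown ones."""
        plan = []
        for word in instruction.lower().split():
            if not kmap.known(word):
                # Open-ended learning: ground the novel word via tutor dialogue.
                kmap.learn(word, ask_tutor(word))
            plan.append(kmap.lexicon[word])
        return plan

    # Usage: seed limited knowledge, then acquire "cup" during interaction.
    kmap = KnowledgeMap({"grasp": "motor:grasp"})
    plan = execute("grasp cup", kmap, ask_tutor=lambda w: f"object:{w}")
    print(plan)  # ['motor:grasp', 'object:cup']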
format Online Article Text
id pubmed-8155541
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
journal Front Neurorobot (Neuroscience section)
published 2021-05-13, Frontiers Media S.A.
license Copyright © 2021 Giorgi, Cangelosi and Masala. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY, https://creativecommons.org/licenses/by/4.0/). Use, distribution, or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution, or reproduction is permitted which does not comply with these terms.
title Learning Actions From Natural Language Instructions Using an ON-World Embodied Cognitive Architecture
topic Neuroscience