Combining Unsupervised and Supervised Learning for Sample Efficient Continuous Language Grounding

Natural and efficient communication with humans requires artificial agents that are able to understand the meaning of natural language. However, understanding natural language is non-trivial and requires proper grounding mechanisms to create links between words and corresponding perceptual information. Since the introduction of the “Symbol Grounding Problem” in 1990, many different grounding approaches have been proposed that employ either supervised or unsupervised learning mechanisms. The latter have the advantage that no other agent is required to learn the correct groundings, while the former are often more sample-efficient and accurate but require the support of another agent, such as a human or another artificial agent. Although combining both paradigms seems natural, it has not received much attention. This paper therefore proposes a hybrid grounding framework that combines both learning paradigms, so that it can utilize support from a tutor when available while still learning when no support is provided. Additionally, the framework has been designed to learn in a continuous and open-ended manner, so that no explicit training phase is required. The proposed framework is evaluated in two different grounding scenarios; its unsupervised grounding component is compared to a state-of-the-art unsupervised Bayesian grounding framework, and the benefit of combining both paradigms is evaluated through the analysis of different feedback rates. The obtained results show that the employed unsupervised grounding mechanism outperforms the baseline in terms of accuracy, transparency, and deployability, and that combining both paradigms increases both the sample efficiency and the accuracy of purely unsupervised grounding while ensuring that the framework can still learn the correct mappings when no supervision is available.


Bibliographic Details
Main Author: Roesler, Oliver
Format: Online Article Text
Language: English
Journal: Front Robot AI
Published: Frontiers Media S.A., 2022-09-30
Subjects: Robotics and AI
Collection: PubMed (National Center for Biotechnology Information), MEDLINE/PubMed record pubmed-9563997
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9563997/
https://www.ncbi.nlm.nih.gov/pubmed/36246495
http://dx.doi.org/10.3389/frobt.2022.701250

Copyright © 2022 Roesler. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY, https://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.