
The Embodied Crossmodal Self Forms Language and Interaction: A Computational Cognitive Review

Human language is inherently embodied and grounded in sensorimotor representations of the self and the world around it. This suggests that the body schema and ideomotor action-effect associations play an important role in language understanding, language generation, and verbal/physical interaction with others. There are computational models that focus purely on non-verbal interaction between humans and robots, and there are computational models for dialog systems that focus only on verbal interaction. However, there is a lack of research that integrates these approaches. We hypothesize that the development of computational models of the self is very appropriate for considering joint verbal and physical interaction. Therefore, they provide the substantial potential to foster the psychological and cognitive understanding of language grounding, and they have significant potential to improve human-robot interaction methods and applications. This review is a first step toward developing models of the self that integrate verbal and non-verbal communication. To this end, we first analyze the relevant findings and mechanisms for language grounding in the psychological and cognitive literature on ideomotor theory. Second, we identify the existing computational methods that implement physical decision-making and verbal interaction. As a result, we outline how the current computational methods can be used to create advanced computational interaction models that integrate language grounding with body schemas and self-representations.


Bibliographic Details
Main Authors: Röder, Frank, Özdemir, Ozan, Nguyen, Phuong D. H., Wermter, Stefan, Eppe, Manfred
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects: Psychology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8415221/
https://www.ncbi.nlm.nih.gov/pubmed/34484079
http://dx.doi.org/10.3389/fpsyg.2021.716671
author Röder, Frank
Özdemir, Ozan
Nguyen, Phuong D. H.
Wermter, Stefan
Eppe, Manfred
author_facet Röder, Frank
Özdemir, Ozan
Nguyen, Phuong D. H.
Wermter, Stefan
Eppe, Manfred
author_sort Röder, Frank
collection PubMed
description Human language is inherently embodied and grounded in sensorimotor representations of the self and the world around it. This suggests that the body schema and ideomotor action-effect associations play an important role in language understanding, language generation, and verbal/physical interaction with others. There are computational models that focus purely on non-verbal interaction between humans and robots, and there are computational models for dialog systems that focus only on verbal interaction. However, there is a lack of research that integrates these approaches. We hypothesize that the development of computational models of the self is very appropriate for considering joint verbal and physical interaction. Therefore, they provide the substantial potential to foster the psychological and cognitive understanding of language grounding, and they have significant potential to improve human-robot interaction methods and applications. This review is a first step toward developing models of the self that integrate verbal and non-verbal communication. To this end, we first analyze the relevant findings and mechanisms for language grounding in the psychological and cognitive literature on ideomotor theory. Second, we identify the existing computational methods that implement physical decision-making and verbal interaction. As a result, we outline how the current computational methods can be used to create advanced computational interaction models that integrate language grounding with body schemas and self-representations.
format Online
Article
Text
id pubmed-8415221
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-8415221 2021-09-04 The Embodied Crossmodal Self Forms Language and Interaction: A Computational Cognitive Review Röder, Frank Özdemir, Ozan Nguyen, Phuong D. H. Wermter, Stefan Eppe, Manfred Front Psychol Psychology Human language is inherently embodied and grounded in sensorimotor representations of the self and the world around it. This suggests that the body schema and ideomotor action-effect associations play an important role in language understanding, language generation, and verbal/physical interaction with others. There are computational models that focus purely on non-verbal interaction between humans and robots, and there are computational models for dialog systems that focus only on verbal interaction. However, there is a lack of research that integrates these approaches. We hypothesize that the development of computational models of the self is very appropriate for considering joint verbal and physical interaction. Therefore, they provide the substantial potential to foster the psychological and cognitive understanding of language grounding, and they have significant potential to improve human-robot interaction methods and applications. This review is a first step toward developing models of the self that integrate verbal and non-verbal communication. To this end, we first analyze the relevant findings and mechanisms for language grounding in the psychological and cognitive literature on ideomotor theory. Second, we identify the existing computational methods that implement physical decision-making and verbal interaction. As a result, we outline how the current computational methods can be used to create advanced computational interaction models that integrate language grounding with body schemas and self-representations. Frontiers Media S.A. 2021-08-16 /pmc/articles/PMC8415221/ /pubmed/34484079 http://dx.doi.org/10.3389/fpsyg.2021.716671 Text en Copyright © 2021 Röder, Özdemir, Nguyen, Wermter and Eppe. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Psychology
Röder, Frank
Özdemir, Ozan
Nguyen, Phuong D. H.
Wermter, Stefan
Eppe, Manfred
The Embodied Crossmodal Self Forms Language and Interaction: A Computational Cognitive Review
title The Embodied Crossmodal Self Forms Language and Interaction: A Computational Cognitive Review
title_full The Embodied Crossmodal Self Forms Language and Interaction: A Computational Cognitive Review
title_fullStr The Embodied Crossmodal Self Forms Language and Interaction: A Computational Cognitive Review
title_full_unstemmed The Embodied Crossmodal Self Forms Language and Interaction: A Computational Cognitive Review
title_short The Embodied Crossmodal Self Forms Language and Interaction: A Computational Cognitive Review
title_sort embodied crossmodal self forms language and interaction: a computational cognitive review
topic Psychology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8415221/
https://www.ncbi.nlm.nih.gov/pubmed/34484079
http://dx.doi.org/10.3389/fpsyg.2021.716671