Intention Understanding in Human–Robot Interaction Based on Visual-NLP Semantics
With the rapid development of robotics and AI technology in recent years, human–robot interaction has made great advances and achieved a practical social impact. Verbal commands are among the most direct and frequently used means of human–robot interaction. Currently, such technology can enable robots...
Main authors: Li, Zhihao; Mu, Yishan; Sun, Zhenglong; Song, Sifan; Su, Jionglong; Zhang, Jiaming
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7888278/
https://www.ncbi.nlm.nih.gov/pubmed/33613223
http://dx.doi.org/10.3389/fnbot.2020.610139
Similar items
- Semantic Representations for NLP Using VerbNet and the Generative Lexicon
  by: Brown, Susan Windisch, et al.
  Published: (2022)
- Intention-Related Natural Language Grounding via Object Affordance Detection and Intention Semantic Extraction
  by: Mi, Jinpeng, et al.
  Published: (2020)
- AppraisalCloudPCT: A Computational Model of Emotions for Socially Interactive Robots for Autistic Rehabilitation
  by: Yan, Ting, et al.
  Published: (2023)
- Learning Semantics of Gestural Instructions for Human-Robot Collaboration
  by: Shukla, Dadhichi, et al.
  Published: (2018)
- Human-Robot Interaction With Robust Prediction of Movement Intention Surpasses Manual Control
  by: Veselic, Sebastijan, et al.
  Published: (2021)