Cross-Modal Reconstruction for Tactile Signal in Human–Robot Interaction
In human–robot interaction (HRI), a human can infer the magnitude of an interaction force solely from visual information, owing to prior knowledge. This paper proposes a method of reconstructing tactile information through cross-modal signal processing. In our method, visual information is ad...
Main Authors: Chen, Mingkai; Xie, Yu
Format: Online Article Text
Language: English
Published: MDPI, 2022
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9460542/ https://www.ncbi.nlm.nih.gov/pubmed/36080977 http://dx.doi.org/10.3390/s22176517
Similar Items
- Spatial Calibration of Humanoid Robot Flexible Tactile Skin for Human–Robot Interaction
  by: Chefchaouni Moussaoui, Sélim, et al.
  Published: (2023)
- Behavioural Models of Risk-Taking in Human–Robot Tactile Interactions
  by: Ren, Qiaoqiao, et al.
  Published: (2023)
- BioIn-Tacto: A compliant multi-modal tactile sensing module for robotic tasks
  by: Alves de Oliveira, Thiago Eustaquio, et al.
  Published: (2023)
- Cross-Modal Sensory Integration of Visual-Tactile Motion Information: Instrument Design and Human Psychophysics
  by: Pei, Yu-Cheng, et al.
  Published: (2013)
- Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human–Robot Interaction
  by: Gandarias, Juan M., et al.
  Published: (2018)