
Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human–Robot Interaction

The use of tactile perception can help first response robotic teams in disaster scenarios, where visibility conditions are often reduced due to the presence of dust, mud, or smoke, distinguishing human limbs from other objects with similar shapes. Here, the integration of the tactile sensor in adaptive grippers is evaluated, measuring the performance of an object recognition task based on deep convolutional neural networks (DCNNs) using a flexible sensor mounted in adaptive grippers. A total of 15 classes with 50 tactile images each were trained, including human body parts and common environment objects, in semi-rigid and flexible adaptive grippers based on the fin ray effect. The classifier was compared against the rigid configuration and a support vector machine classifier (SVM). Finally, a two-level output network has been proposed to provide both object-type recognition and human/non-human classification. Sensors in adaptive grippers have a higher number of non-null tactels (up to 37% more), with a lower mean of pressure values (up to 72% less) than when using a rigid sensor, with a softer grip, which is needed in physical human–robot interaction (pHRI). A semi-rigid implementation with 95.13% object recognition rate was chosen, even though the human/non-human classification had better results (98.78%) with a rigid sensor.


Bibliographic Details
Main Authors: Gandarias, Juan M., Gómez-de-Gabriel, Jesús M., García-Cerezo, Alfonso J.
Format: Online Article Text
Language: English
Published: MDPI 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5876667/
https://www.ncbi.nlm.nih.gov/pubmed/29495409
http://dx.doi.org/10.3390/s18030692
_version_ 1783310558904188928
author Gandarias, Juan M.
Gómez-de-Gabriel, Jesús M.
García-Cerezo, Alfonso J.
author_facet Gandarias, Juan M.
Gómez-de-Gabriel, Jesús M.
García-Cerezo, Alfonso J.
author_sort Gandarias, Juan M.
collection PubMed
description The use of tactile perception can help first response robotic teams in disaster scenarios, where visibility conditions are often reduced due to the presence of dust, mud, or smoke, distinguishing human limbs from other objects with similar shapes. Here, the integration of the tactile sensor in adaptive grippers is evaluated, measuring the performance of an object recognition task based on deep convolutional neural networks (DCNNs) using a flexible sensor mounted in adaptive grippers. A total of 15 classes with 50 tactile images each were trained, including human body parts and common environment objects, in semi-rigid and flexible adaptive grippers based on the fin ray effect. The classifier was compared against the rigid configuration and a support vector machine classifier (SVM). Finally, a two-level output network has been proposed to provide both object-type recognition and human/non-human classification. Sensors in adaptive grippers have a higher number of non-null tactels (up to 37% more), with a lower mean of pressure values (up to 72% less) than when using a rigid sensor, with a softer grip, which is needed in physical human–robot interaction (pHRI). A semi-rigid implementation with 95.13% object recognition rate was chosen, even though the human/non-human classification had better results (98.78%) with a rigid sensor.
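The abstract above mentions a two-level output network that produces both an object-type label (15 classes) and a human/non-human decision from the same tactile image. The following is a minimal PyTorch sketch of that idea only; the backbone, layer sizes, and the 28x28 single-channel input resolution are illustrative assumptions, not the architecture reported in the paper.

```python
# Illustrative sketch of a two-level output DCNN for tactile images.
# One head predicts the object type (15 classes), the other predicts
# human vs. non-human. Layer sizes and the 28x28 input resolution are
# assumptions for illustration, not the authors' implementation.
import torch
import torch.nn as nn


class TwoLevelTactileNet(nn.Module):
    def __init__(self, num_object_classes: int = 15):
        super().__init__()
        # Shared convolutional backbone over single-channel pressure maps.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),
        )
        feat_dim = 32 * 7 * 7  # assumes 28x28 tactile images at the input
        # First-level output: object-type recognition (15 classes).
        self.object_head = nn.Linear(feat_dim, num_object_classes)
        # Second-level output: human / non-human classification.
        self.human_head = nn.Linear(feat_dim, 2)

    def forward(self, x: torch.Tensor):
        feats = self.backbone(x)
        return self.object_head(feats), self.human_head(feats)


if __name__ == "__main__":
    net = TwoLevelTactileNet()
    batch = torch.rand(4, 1, 28, 28)  # fake tactile pressure maps in [0, 1]
    object_logits, human_logits = net(batch)
    print(object_logits.shape, human_logits.shape)  # (4, 15) and (4, 2)
```

In a sketch like this, both heads would typically be trained jointly with one cross-entropy loss per output, over a dataset comparable to the 15-class, 50-image-per-class set of tactile images described in the abstract.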
format Online
Article
Text
id pubmed-5876667
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-58766672018-04-09 Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human–Robot Interaction Gandarias, Juan M. Gómez-de-Gabriel, Jesús M. García-Cerezo, Alfonso J. Sensors (Basel) Article The use of tactile perception can help first response robotic teams in disaster scenarios, where visibility conditions are often reduced due to the presence of dust, mud, or smoke, distinguishing human limbs from other objects with similar shapes. Here, the integration of the tactile sensor in adaptive grippers is evaluated, measuring the performance of an object recognition task based on deep convolutional neural networks (DCNNs) using a flexible sensor mounted in adaptive grippers. A total of 15 classes with 50 tactile images each were trained, including human body parts and common environment objects, in semi-rigid and flexible adaptive grippers based on the fin ray effect. The classifier was compared against the rigid configuration and a support vector machine classifier (SVM). Finally, a two-level output network has been proposed to provide both object-type recognition and human/non-human classification. Sensors in adaptive grippers have a higher number of non-null tactels (up to 37% more), with a lower mean of pressure values (up to 72% less) than when using a rigid sensor, with a softer grip, which is needed in physical human–robot interaction (pHRI). A semi-rigid implementation with 95.13% object recognition rate was chosen, even though the human/non-human classification had better results (98.78%) with a rigid sensor. MDPI 2018-02-26 /pmc/articles/PMC5876667/ /pubmed/29495409 http://dx.doi.org/10.3390/s18030692 Text en © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Gandarias, Juan M.
Gómez-de-Gabriel, Jesús M.
García-Cerezo, Alfonso J.
Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human–Robot Interaction
title Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human–Robot Interaction
title_full Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human–Robot Interaction
title_fullStr Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human–Robot Interaction
title_full_unstemmed Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human–Robot Interaction
title_short Enhancing Perception with Tactile Object Recognition in Adaptive Grippers for Human–Robot Interaction
title_sort enhancing perception with tactile object recognition in adaptive grippers for human–robot interaction
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5876667/
https://www.ncbi.nlm.nih.gov/pubmed/29495409
http://dx.doi.org/10.3390/s18030692
work_keys_str_mv AT gandariasjuanm enhancingperceptionwithtactileobjectrecognitioninadaptivegrippersforhumanrobotinteraction
AT gomezdegabrieljesusm enhancingperceptionwithtactileobjectrecognitioninadaptivegrippersforhumanrobotinteraction
AT garciacerezoalfonsoj enhancingperceptionwithtactileobjectrecognitioninadaptivegrippersforhumanrobotinteraction