
Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning

There are physical Human–Robot Interaction (pHRI) applications where the robot has to grab the human body, such as rescue or assistive robotics. Being able to precisely estimate the grasping location when grabbing a human limb is crucial to perform a safe manipulation of the human. Computer vision methods provide pre-grasp information with strong constraints imposed by the field environments. Force-based compliant control, after grasping, limits the amount of applied strength. On the other hand, valuable tactile and proprioceptive information can be obtained from the pHRI gripper, which can be used to better know the features of the human and the contact state between the human and the robot. This paper presents a novel dataset of tactile and kinesthetic data obtained from a robot gripper that grabs a human forearm.

Full description

Bibliographic Details
Main Authors: Pastor, Francisco, Lin-Yang, Da-hui, Gómez-de-Gabriel, Jesús M., García-Cerezo, Alfonso J.
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9696784/
https://www.ncbi.nlm.nih.gov/pubmed/36433347
http://dx.doi.org/10.3390/s22228752
_version_ 1784838393959546880
author Pastor, Francisco
Lin-Yang, Da-hui
Gómez-de-Gabriel, Jesús M.
García-Cerezo, Alfonso J.
author_facet Pastor, Francisco
Lin-Yang, Da-hui
Gómez-de-Gabriel, Jesús M.
García-Cerezo, Alfonso J.
author_sort Pastor, Francisco
collection PubMed
description There are physical Human–Robot Interaction (pHRI) applications where the robot has to grab the human body, such as rescue or assistive robotics. Being able to precisely estimate the grasping location when grabbing a human limb is crucial to perform a safe manipulation of the human. Computer vision methods provide pre-grasp information with strong constraints imposed by the field environments. Force-based compliant control, after grasping, limits the amount of applied strength. On the other hand, valuable tactile and proprioceptive information can be obtained from the pHRI gripper, which can be used to better know the features of the human and the contact state between the human and the robot. This paper presents a novel dataset of tactile and kinesthetic data obtained from a robot gripper that grabs a human forearm. The dataset is collected with a three-fingered gripper with two underactuated fingers and a fixed finger with a high-resolution tactile sensor. A palpation procedure is performed to record the shape of the forearm and to recognize the bones and muscles in different sections. Moreover, an application for the use of the database is included. In particular, a fusion approach is used to estimate the actual grasped forearm section using both kinesthetic and tactile information on a regression deep-learning neural network. First, tactile and kinesthetic data are trained separately with Long Short-Term Memory (LSTM) neural networks, considering the data are sequential. Then, the outputs are fed to a Fusion neural network to enhance the estimation. The experiments conducted show good results in training both sources separately, with superior performance when the fusion approach is considered.
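The fusion approach described in the abstract — two single-modality models whose outputs feed a fusion network — can be illustrated with a minimal sketch. This is hypothetical code, not the authors' implementation: the two LSTMs are replaced by simulated per-modality section predictions with independent noise, and the fusion network by a linear least-squares combiner, just to show why fusing two noisy estimators of the grasped forearm section can outperform either one alone.

```python
# Hypothetical late-fusion sketch (NOT the paper's code): simulated
# tactile and kinesthetic regressors each estimate the grasped
# forearm section; a linear combiner learned by least squares plays
# the role of the fusion network.
import random

random.seed(0)

# Normalized forearm-section positions (0 = wrist end, 1 = elbow end)
# and simulated single-modality predictions with independent noise.
sections = [i / 20 for i in range(21)]
tactile = [s + random.gauss(0, 0.05) for s in sections]
kinesthetic = [s + random.gauss(0, 0.08) for s in sections]

def solve3(A, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_fusion(t, k, y):
    """Least-squares weights (w_t, w_k, bias) for y ~ w_t*t + w_k*k + bias."""
    X = [[ti, ki, 1.0] for ti, ki in zip(t, k)]
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    return solve3(A, b)

def mse(pred, y):
    return sum((p - t) ** 2 for p, t in zip(pred, y)) / len(y)

w_t, w_k, bias = fit_fusion(tactile, kinesthetic, sections)
fused = [w_t * t + w_k * k + bias for t, k in zip(tactile, kinesthetic)]

print(mse(tactile, sections), mse(kinesthetic, sections), mse(fused, sections))
```

Because either single-modality predictor is a special case of the linear combiner (weights (1, 0, 0) or (0, 1, 0)), the fused fit can never do worse than the better modality on the fitting data — the same intuition that motivates training the two LSTMs first and then a fusion network on their outputs.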
format Online
Article
Text
id pubmed-9696784
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9696784 2022-11-26 Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning Pastor, Francisco Lin-Yang, Da-hui Gómez-de-Gabriel, Jesús M. García-Cerezo, Alfonso J. Sensors (Basel) Article
MDPI 2022-11-12 /pmc/articles/PMC9696784/ /pubmed/36433347 http://dx.doi.org/10.3390/s22228752 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Pastor, Francisco
Lin-Yang, Da-hui
Gómez-de-Gabriel, Jesús M.
García-Cerezo, Alfonso J.
Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning
title Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning
title_full Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning
title_fullStr Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning
title_full_unstemmed Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning
title_short Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning
title_sort dataset with tactile and kinesthetic information from a human forearm and its application to deep learning
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9696784/
https://www.ncbi.nlm.nih.gov/pubmed/36433347
http://dx.doi.org/10.3390/s22228752
work_keys_str_mv AT pastorfrancisco datasetwithtactileandkinestheticinformationfromahumanforearmanditsapplicationtodeeplearning
AT linyangdahui datasetwithtactileandkinestheticinformationfromahumanforearmanditsapplicationtodeeplearning
AT gomezdegabrieljesusm datasetwithtactileandkinestheticinformationfromahumanforearmanditsapplicationtodeeplearning
AT garciacerezoalfonsoj datasetwithtactileandkinestheticinformationfromahumanforearmanditsapplicationtodeeplearning