Estimating the Orientation of Objects from Tactile Sensing Data Using Machine Learning Methods and Visual Frames of Reference

Bibliographic Details
Main Authors: Prado da Fonseca, Vinicius; Alves de Oliveira, Thiago Eustaquio; Petriu, Emil M.
Format: Online Article Text
Language: English
Published: MDPI 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6567179/
https://www.ncbi.nlm.nih.gov/pubmed/31108951
http://dx.doi.org/10.3390/s19102285
author Prado da Fonseca, Vinicius
Alves de Oliveira, Thiago Eustaquio
Petriu, Emil M.
collection PubMed
description Underactuated hands are useful tools for robotic in-hand manipulation tasks due to their capability to adapt seamlessly to unknown objects. Enabling robots that use such hands to achieve and maintain stable grasping conditions, even under external disturbances, while keeping track of an in-hand object’s state requires learning the relationships between objects and tactile sensing data. The human somatosensory system combines visual and tactile sensing information in its “What and Where” subsystem to achieve high levels of manipulation skill. The present paper proposes an approach for estimating the pose of in-hand objects that combines tactile sensing data with visual frames of reference, analogous to the human “What and Where” subsystem. The proposed system uses machine learning methods to estimate the orientation of in-hand objects from data gathered by tactile sensors mounted on the phalanges of underactuated fingers. While tactile sensing provides local information about objects during in-hand manipulation, a vision system generates egocentric and allocentric frames of reference. A dual fuzzy logic controller was developed to achieve and sustain stable grasping conditions autonomously while forces were applied to in-hand objects to expose the system to different object configurations. Two sets of experiments were used to explore the system’s capabilities. In the first set, external forces changed the orientation of objects while the fuzzy controller kept the objects in-hand during tactile and visual data collection for five machine learning estimators. Among these estimators, the ridge regressor achieved an average mean squared error of [Formula: see text]. In the second set of experiments, one of the underactuated fingers performed open-loop object rotations, and the recorded data were supplied to the same set of estimators. In this scenario, the multilayer perceptron (MLP) neural network achieved the lowest mean squared error of [Formula: see text].
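The abstract reports that regression estimators, a ridge regressor among them, were trained to map tactile sensing data to object orientation. As a rough illustration only, not the authors' implementation, the sketch below fits a closed-form ridge regressor on invented two-taxel "tactile" readings to predict an orientation angle; all data values, the two-feature setup, and the regularization strength are assumptions made for this toy example.

```python
# Toy sketch (not the paper's code): ridge regression from two invented
# "tactile" pressure features to an object orientation angle in degrees.

def ridge_fit(X, y, lam=0.01):
    """Closed-form ridge solution w = (X^T X + lam*I)^-1 X^T y,
    written out explicitly for the two-feature case."""
    a = sum(x[0] * x[0] for x in X) + lam       # (X^T X)[0,0] + lam
    b = sum(x[0] * x[1] for x in X)             # (X^T X)[0,1]
    d = sum(x[1] * x[1] for x in X) + lam       # (X^T X)[1,1] + lam
    t0 = sum(x[0] * yi for x, yi in zip(X, y))  # (X^T y)[0]
    t1 = sum(x[1] * yi for x, yi in zip(X, y))  # (X^T y)[1]
    det = a * d - b * b                         # 2x2 determinant
    return [(d * t0 - b * t1) / det, (a * t1 - b * t0) / det]

def predict(w, x):
    return w[0] * x[0] + w[1] * x[1]

# Invented readings from two pressure taxels and the object's angle (deg).
X = [[0.1, 0.9], [0.4, 0.6], [0.7, 0.3], [0.9, 0.1]]
y = [80.0, 55.0, 30.0, 10.0]

w = ridge_fit(X, y)
mse = sum((predict(w, x) - yi) ** 2 for x, yi in zip(X, y)) / len(y)
```

The regularization term `lam` keeps the normal-equation matrix well conditioned even when taxel readings are strongly correlated, which is one reason a ridge regressor is a natural baseline for this kind of tactile data.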
format Online
Article
Text
id pubmed-6567179
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-6567179 2019-06-17 Sensors (Basel) Article MDPI 2019-05-17 /pmc/articles/PMC6567179/ /pubmed/31108951 http://dx.doi.org/10.3390/s19102285 Text en © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
title Estimating the Orientation of Objects from Tactile Sensing Data Using Machine Learning Methods and Visual Frames of Reference
topic Article