Interactive robot teaching based on finger trajectory using multimodal RGB-D-T-data
The concept of Industry 4.0 is changing industrial manufacturing patterns, which are becoming more efficient and more flexible. In response to this tendency, efficient robot teaching approaches without complex programming have become a popular research direction. Therefore, we propose an interactive finger-touch based robot teaching schema using multimodal 3D image (color (RGB), thermal (T) and point cloud (3D)) processing. Here, the heat trace left by the finger touching the object surface is analyzed on multimodal data in order to precisely identify the true hand/object contact points. These identified contact points are used to calculate the robot path directly. To optimize the identification of the contact points, we propose a calculation scheme using a number of anchor points, which are first predicted by hand/object point cloud segmentation. Subsequently, a probability density function is defined to calculate the prior probability distribution of the true finger trace. The temperature in the neighborhood of each anchor point is then dynamically analyzed to calculate the likelihood. Experiments show that the trajectories estimated by our multimodal method have significantly better accuracy and smoothness than those obtained by analyzing only the point cloud and the static temperature distribution.

Main Authors: | Zhang, Yan; Fütterer, Richard; Notni, Gunther
---|---
Format: | Online Article Text
Language: | English
Published: | Frontiers Media S.A., 2023
Subjects: | Robotics and AI
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10060539/ https://www.ncbi.nlm.nih.gov/pubmed/37008984 http://dx.doi.org/10.3389/frobt.2023.1120357
field | value
---|---
_version_ | 1785017113387204608
author | Zhang, Yan; Fütterer, Richard; Notni, Gunther
author_sort | Zhang, Yan
collection | PubMed |
description | The concept of Industry 4.0 is changing industrial manufacturing patterns, which are becoming more efficient and more flexible. In response to this tendency, efficient robot teaching approaches without complex programming have become a popular research direction. Therefore, we propose an interactive finger-touch based robot teaching schema using multimodal 3D image (color (RGB), thermal (T) and point cloud (3D)) processing. Here, the heat trace left by the finger touching the object surface is analyzed on multimodal data in order to precisely identify the true hand/object contact points. These identified contact points are used to calculate the robot path directly. To optimize the identification of the contact points, we propose a calculation scheme using a number of anchor points, which are first predicted by hand/object point cloud segmentation. Subsequently, a probability density function is defined to calculate the prior probability distribution of the true finger trace. The temperature in the neighborhood of each anchor point is then dynamically analyzed to calculate the likelihood. Experiments show that the trajectories estimated by our multimodal method have significantly better accuracy and smoothness than those obtained by analyzing only the point cloud and the static temperature distribution. |
format | Online Article Text |
id | pubmed-10060539 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-10060539 2023-03-31 Interactive robot teaching based on finger trajectory using multimodal RGB-D-T-data. Zhang, Yan; Fütterer, Richard; Notni, Gunther. Front Robot AI (Robotics and AI). The concept of Industry 4.0 is changing industrial manufacturing patterns, which are becoming more efficient and more flexible. In response to this tendency, efficient robot teaching approaches without complex programming have become a popular research direction. Therefore, we propose an interactive finger-touch based robot teaching schema using multimodal 3D image (color (RGB), thermal (T) and point cloud (3D)) processing. Here, the heat trace left by the finger touching the object surface is analyzed on multimodal data in order to precisely identify the true hand/object contact points. These identified contact points are used to calculate the robot path directly. To optimize the identification of the contact points, we propose a calculation scheme using a number of anchor points, which are first predicted by hand/object point cloud segmentation. Subsequently, a probability density function is defined to calculate the prior probability distribution of the true finger trace. The temperature in the neighborhood of each anchor point is then dynamically analyzed to calculate the likelihood. Experiments show that the trajectories estimated by our multimodal method have significantly better accuracy and smoothness than those obtained by analyzing only the point cloud and the static temperature distribution. Frontiers Media S.A. 2023-03-16 /pmc/articles/PMC10060539/ /pubmed/37008984 http://dx.doi.org/10.3389/frobt.2023.1120357 Text en Copyright © 2023 Zhang, Fütterer and Notni. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
title | Interactive robot teaching based on finger trajectory using multimodal RGB-D-T-data |
topic | Robotics and AI |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10060539/ https://www.ncbi.nlm.nih.gov/pubmed/37008984 http://dx.doi.org/10.3389/frobt.2023.1120357 |
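The abstract outlines a Bayesian-style estimation: hand/object segmentation supplies candidate anchor points, a probability density function gives a prior over the true finger trace, and the temperature around each anchor gives a likelihood. As a rough illustration only (not the paper's actual implementation), a maximum-a-posteriori selection over candidate anchors could look like the sketch below; the function name, the Gaussian prior, the logistic temperature likelihood, and all parameter values are assumptions made for this sketch:

```python
import math

def estimate_contact_point(anchor_points, temps, expected_trace,
                           sigma=0.01, t_contact=30.0):
    """Pick the anchor point maximizing prior * likelihood (MAP estimate).

    anchor_points  -- list of (x, y, z) candidates from hand/object segmentation
    temps          -- mean neighborhood temperature (deg C) for each anchor
    expected_trace -- (x, y, z) predicted finger-trace position (prior mean)
    This Gaussian/logistic model is an illustrative assumption, not the
    paper's published formulation.
    """
    best, best_score = None, -1.0
    for p, t in zip(anchor_points, temps):
        # Prior: Gaussian falloff with squared distance from the predicted trace
        d2 = sum((a - b) ** 2 for a, b in zip(p, expected_trace))
        prior = math.exp(-d2 / (2.0 * sigma ** 2))
        # Likelihood: warmer neighborhoods (heat trace left by the finger) score higher
        likelihood = 1.0 / (1.0 + math.exp(-(t - t_contact)))
        score = prior * likelihood
        if score > best_score:
            best, best_score = p, score
    return best

# Example: three candidate anchors; the second is both close to the
# predicted trace and warm, so it wins.
anchors = [(0.00, 0.00, 0.0), (0.01, 0.00, 0.0), (0.05, 0.02, 0.0)]
temps = [24.0, 33.0, 25.0]
contact = estimate_contact_point(anchors, temps, expected_trace=(0.012, 0.0, 0.0))
# contact == (0.01, 0.0, 0.0)
```

Combining a geometric prior with a thermal likelihood is what lets the method reject anchors that are spatially plausible but were never actually touched (no residual heat), which matches the abstract's claim of better accuracy than point-cloud-only analysis.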