
Vision and RTLS Safety Implementation in an Experimental Human—Robot Collaboration Scenario

Human–robot collaboration is becoming ever more widespread in industry because of its adaptability. Conventional safety elements are used when converting a workplace into a collaborative one, although new technologies are becoming more widespread. This work proposes a safe robotic workplace that can adapt its operation and speed depending on the surrounding stimuli. The benefit lies in its use of promising technologies that combine safety and collaboration. Using a depth camera operating on the passive stereo principle, safety zones are created around the robotic workplace, while objects moving around the workplace are identified, including their distance from the robotic system. Passive stereo employs two colour streams that enable distance computation based on pixel shift. The colour stream is also used in the human identification process. Human identification is achieved using the Histogram of Oriented Gradients, pre-learned precisely for this purpose. The workplace also features autonomous trolleys for material supply. Unequivocal trolley identification is achieved using a real-time location system through tags placed on each trolley. The robotic workplace’s speed and the halting of its work depend on the positions of objects within safety zones. The entry of a trolley with an exception to a safety zone does not affect the workplace speed. This work simulates individual scenarios that may occur at a robotic workplace with an emphasis on compliance with safety measures. The novelty lies in the integration of a real-time location system into a vision-based safety system, which are not new technologies by themselves, but their interconnection to achieve exception handling in order to reduce downtimes in the collaborative robotic system is innovative.


Bibliographic Details
Main Authors: Slovák, Juraj, Melicher, Markus, Šimovec, Matej, Vachálek, Ján
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8037017/
https://www.ncbi.nlm.nih.gov/pubmed/33915798
http://dx.doi.org/10.3390/s21072419
author Slovák, Juraj
Melicher, Markus
Šimovec, Matej
Vachálek, Ján
collection PubMed
description Human–robot collaboration is becoming ever more widespread in industry because of its adaptability. Conventional safety elements are used when converting a workplace into a collaborative one, although new technologies are becoming more widespread. This work proposes a safe robotic workplace that can adapt its operation and speed depending on the surrounding stimuli. The benefit lies in its use of promising technologies that combine safety and collaboration. Using a depth camera operating on the passive stereo principle, safety zones are created around the robotic workplace, while objects moving around the workplace are identified, including their distance from the robotic system. Passive stereo employs two colour streams that enable distance computation based on pixel shift. The colour stream is also used in the human identification process. Human identification is achieved using the Histogram of Oriented Gradients, pre-learned precisely for this purpose. The workplace also features autonomous trolleys for material supply. Unequivocal trolley identification is achieved using a real-time location system through tags placed on each trolley. The robotic workplace’s speed and the halting of its work depend on the positions of objects within safety zones. The entry of a trolley with an exception to a safety zone does not affect the workplace speed. This work simulates individual scenarios that may occur at a robotic workplace with an emphasis on compliance with safety measures. The novelty lies in the integration of a real-time location system into a vision-based safety system, which are not new technologies by themselves, but their interconnection to achieve exception handling in order to reduce downtimes in the collaborative robotic system is innovative.
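The control behaviour summarised in the abstract can be sketched as a small decision function: the vision system reports detected objects and their distances, the robot slows when something enters an outer safety zone, halts when something enters the inner zone, and ignores trolleys whose RTLS tags carry a registered exception. This is a hypothetical illustration, not the authors' implementation; all names, zone radii, and speed factors below are assumptions.

```python
# Illustrative speed factors (fraction of nominal robot speed).
FULL_SPEED, REDUCED_SPEED, STOPPED = 1.0, 0.3, 0.0

# Hypothetical zone radii around the robot, in metres.
WARNING_ZONE_M = 2.0   # entering this zone reduces speed
STOP_ZONE_M = 1.0      # entering this zone halts the robot

def robot_speed(detections, rtls_exceptions):
    """Return a speed factor for the robot given the current detections.

    detections: list of (object_id, distance_m) pairs from the vision system.
    rtls_exceptions: set of RTLS tag ids (e.g. supply trolleys) that are
        allowed inside the safety zones without affecting workplace speed.
    """
    speed = FULL_SPEED
    for obj_id, distance_m in detections:
        if obj_id in rtls_exceptions:
            continue  # tagged trolley with an exception: no speed change
        if distance_m <= STOP_ZONE_M:
            return STOPPED  # inner zone breached: halt immediately
        if distance_m <= WARNING_ZONE_M:
            speed = min(speed, REDUCED_SPEED)  # outer zone: slow down
    return speed
```

For example, a tagged trolley 0.5 m away leaves the robot at full speed, while an untagged person at the same distance stops it, which is exactly the exception-handling behaviour the abstract credits with reducing downtimes.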
format Online
Article
Text
id pubmed-8037017
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8037017 2021-04-12 Vision and RTLS Safety Implementation in an Experimental Human—Robot Collaboration Scenario. Slovák, Juraj; Melicher, Markus; Šimovec, Matej; Vachálek, Ján. Sensors (Basel), Article.
MDPI 2021-04-01 /pmc/articles/PMC8037017/ /pubmed/33915798 http://dx.doi.org/10.3390/s21072419 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Vision and RTLS Safety Implementation in an Experimental Human—Robot Collaboration Scenario
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8037017/
https://www.ncbi.nlm.nih.gov/pubmed/33915798
http://dx.doi.org/10.3390/s21072419