
A Bayesian Deep Neural Network for Safe Visual Servoing in Human–Robot Interaction

Safety is an important issue in human–robot interaction (HRI) applications. Various research works have focused on different levels of safety in HRI. If a human/obstacle is detected, a repulsive action can be taken to avoid the collision. Common repulsive actions include distance methods, potential...

Full description

Bibliographic Details
Main Authors: Shi, Lei, Copot, Cosmin, Vanlanduit, Steve
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8247479/
https://www.ncbi.nlm.nih.gov/pubmed/34222355
http://dx.doi.org/10.3389/frobt.2021.687031
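
The abstract (full text in the description field below) reports using Monte Carlo dropout to turn a regression DNN into an approximate Bayesian DNN and evaluating it with the predictive interval coverage probability (PICP). The following is a minimal, illustrative Python/PyTorch sketch of those two ideas only, not the authors' implementation; the network layout, layer sizes, and the names RepulsivePoseNet, mc_dropout_predict, and picp are assumptions made for this example.

# Illustrative sketch only (not the paper's code): MC dropout keeps dropout
# active at inference time and draws several stochastic forward passes, so a
# standard regression DNN behaves as an approximate Bayesian DNN. The network
# layout, layer sizes, and the picp() helper are assumptions for this example.
import torch
import torch.nn as nn

class RepulsivePoseNet(nn.Module):
    # Toy regressor mapping a hand-detection feature vector to a 3-D repulsive position.
    def __init__(self, in_dim=6, hidden=128, p_drop=0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, 3),  # predicted repulsive position (x, y, z)
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def mc_dropout_predict(model, x, n_samples=50):
    # Keep dropout layers active (the "MC" part of MC dropout) and return the
    # per-dimension predictive mean and standard deviation over the samples.
    model.train()
    samples = torch.stack([model(x) for _ in range(n_samples)])  # (S, N, 3)
    return samples.mean(dim=0), samples.std(dim=0)

def picp(y_true, mean, std, z=1.96):
    # Predictive interval coverage probability: fraction of ground-truth
    # values falling inside the z-sigma predictive interval, per axis.
    lower, upper = mean - z * std, mean + z * std
    inside = (y_true >= lower) & (y_true <= upper)
    return inside.float().mean(dim=0)  # one coverage value per x/y/z axis

if __name__ == "__main__":
    model = RepulsivePoseNet()
    x = torch.randn(32, 6)  # dummy hand features (placeholder data)
    y = torch.randn(32, 3)  # dummy ground-truth repulsive positions
    mean, std = mc_dropout_predict(model, x)
    print("PICP per axis:", picp(y, mean, std))

Keeping the dropout layers active at prediction time is what distinguishes MC dropout from a deterministic forward pass; the spread of the sampled outputs provides the uncertainty estimate that a plain DNN does not give.
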
_version_ 1783716532676722688
author Shi, Lei
Copot, Cosmin
Vanlanduit, Steve
author_facet Shi, Lei
Copot, Cosmin
Vanlanduit, Steve
author_sort Shi, Lei
collection PubMed
description Safety is an important issue in human–robot interaction (HRI) applications. Various research works have focused on different levels of safety in HRI. If a human/obstacle is detected, a repulsive action can be taken to avoid the collision. Common repulsive actions include distance methods, potential field methods, and safety field methods. Machine-learning-based approaches to selecting the repulsive action are less explored. Few research works focus on the uncertainty of data-based approaches or consider the efficiency of the task being executed during collision avoidance. In this study, we describe a system that can avoid collision with human hands while the robot is executing an image-based visual servoing (IBVS) task. We use Monte Carlo dropout (MC dropout) to transform a deep neural network (DNN) into a Bayesian DNN and learn the repulsive position for hand avoidance. The Bayesian DNN allows IBVS to converge faster than with the opposite repulsive pose. Furthermore, it allows the robot to avoid undesired poses that a standard DNN cannot avoid. The experimental results show that the Bayesian DNN has adequate accuracy and can generalize well to unseen data. The predictive interval coverage probability (PICP) values of the predictions along the x, y, and z directions are 0.84, 0.94, and 0.95, respectively. In regions of the workspace unseen in the training data, the Bayesian DNN is also more robust than a standard DNN. We further implement the system on a UR10 robot and test the robustness of the Bayesian DNN and the IBVS convergence speed. Results show that the Bayesian DNN can avoid poses outside the reach range of the robot and lets the IBVS task converge faster than the opposite repulsive pose.
format Online
Article
Text
id pubmed-8247479
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-8247479 2021-07-02 A Bayesian Deep Neural Network for Safe Visual Servoing in Human–Robot Interaction Shi, Lei Copot, Cosmin Vanlanduit, Steve Front Robot AI Robotics and AI Safety is an important issue in human–robot interaction (HRI) applications. Various research works have focused on different levels of safety in HRI. If a human/obstacle is detected, a repulsive action can be taken to avoid the collision. Common repulsive actions include distance methods, potential field methods, and safety field methods. Machine-learning-based approaches to selecting the repulsive action are less explored. Few research works focus on the uncertainty of data-based approaches or consider the efficiency of the task being executed during collision avoidance. In this study, we describe a system that can avoid collision with human hands while the robot is executing an image-based visual servoing (IBVS) task. We use Monte Carlo dropout (MC dropout) to transform a deep neural network (DNN) into a Bayesian DNN and learn the repulsive position for hand avoidance. The Bayesian DNN allows IBVS to converge faster than with the opposite repulsive pose. Furthermore, it allows the robot to avoid undesired poses that a standard DNN cannot avoid. The experimental results show that the Bayesian DNN has adequate accuracy and can generalize well to unseen data. The predictive interval coverage probability (PICP) values of the predictions along the x, y, and z directions are 0.84, 0.94, and 0.95, respectively. In regions of the workspace unseen in the training data, the Bayesian DNN is also more robust than a standard DNN. We further implement the system on a UR10 robot and test the robustness of the Bayesian DNN and the IBVS convergence speed. Results show that the Bayesian DNN can avoid poses outside the reach range of the robot and lets the IBVS task converge faster than the opposite repulsive pose. Frontiers Media S.A. 2021-06-17 /pmc/articles/PMC8247479/ /pubmed/34222355 http://dx.doi.org/10.3389/frobt.2021.687031 Text en Copyright © 2021 Shi, Copot and Vanlanduit. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Robotics and AI
Shi, Lei
Copot, Cosmin
Vanlanduit, Steve
A Bayesian Deep Neural Network for Safe Visual Servoing in Human–Robot Interaction
title A Bayesian Deep Neural Network for Safe Visual Servoing in Human–Robot Interaction
title_full A Bayesian Deep Neural Network for Safe Visual Servoing in Human–Robot Interaction
title_fullStr A Bayesian Deep Neural Network for Safe Visual Servoing in Human–Robot Interaction
title_full_unstemmed A Bayesian Deep Neural Network for Safe Visual Servoing in Human–Robot Interaction
title_short A Bayesian Deep Neural Network for Safe Visual Servoing in Human–Robot Interaction
title_sort bayesian deep neural network for safe visual servoing in human–robot interaction
topic Robotics and AI
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8247479/
https://www.ncbi.nlm.nih.gov/pubmed/34222355
http://dx.doi.org/10.3389/frobt.2021.687031
work_keys_str_mv AT shilei abayesiandeepneuralnetworkforsafevisualservoinginhumanrobotinteraction
AT copotcosmin abayesiandeepneuralnetworkforsafevisualservoinginhumanrobotinteraction
AT vanlanduitsteve abayesiandeepneuralnetworkforsafevisualservoinginhumanrobotinteraction
AT shilei bayesiandeepneuralnetworkforsafevisualservoinginhumanrobotinteraction
AT copotcosmin bayesiandeepneuralnetworkforsafevisualservoinginhumanrobotinteraction
AT vanlanduitsteve bayesiandeepneuralnetworkforsafevisualservoinginhumanrobotinteraction