
Mix Frame Visual Servo Control Framework for Autonomous Assistive Robotic Arms

Assistive robotic arms (ARAs) that provide care to the elderly and people with disabilities are a significant part of Human-Robot Interaction (HRI). Presently available ARAs provide non-intuitive interfaces such as joysticks for control and thus lack the autonomy to perform daily activities. This...


Bibliographic Details
Main Authors: Arif, Zubair, Fu, Yili
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8778787/
https://www.ncbi.nlm.nih.gov/pubmed/35062597
http://dx.doi.org/10.3390/s22020642
_version_ 1784637408846807040
author Arif, Zubair
Fu, Yili
author_facet Arif, Zubair
Fu, Yili
author_sort Arif, Zubair
collection PubMed
description Assistive robotic arms (ARAs) that provide care to the elderly and people with disabilities are a significant part of Human-Robot Interaction (HRI). Presently available ARAs provide non-intuitive interfaces such as joysticks for control and thus lack the autonomy to perform daily activities. This study proposes that, for inducing autonomous behavior in ARAs, the integration of visual sensors is vital, and visual servoing in the direct Cartesian control mode is the preferred method. Generally, ARAs are designed in a configuration where the end-effector’s position is defined in the fixed base frame while its orientation is expressed in the end-effector frame. We denote this configuration as ‘mixed frame robotic arms’. Consequently, conventional visual servo controllers, which operate in a single frame of reference, are incompatible with mixed frame ARAs. Therefore, we propose a mixed-frame visual servo control framework for ARAs. Moreover, we elucidate the task space kinematics of a mixed frame ARA, which led us to the development of a novel “mixed frame Jacobian matrix”. The proposed framework was validated on a mixed frame JACO-2 7 DoF ARA using an adaptive proportional-derivative controller for achieving image-based visual servoing (IBVS), which showed a significant increase of 31% in the convergence rate, outperforming conventional IBVS joint controllers, especially in outstretched arm positions and near the base frame. Our results demonstrate the need for a mixed frame controller for deploying visual servo control on modern ARAs, as it can inherently cater to the robotic arm’s joint limits, singularities, and self-collision problems.
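As a rough sketch of the control structure described in the abstract (not the authors' exact formulation), the classical image-based visual servoing law and one plausible way to assemble a mixed-frame Jacobian can be written as follows. The symbols \(\lambda\), \(\widehat{\mathbf{L}}_{\mathbf{s}}\), \({}^{b}\mathbf{J}_{v}\), \({}^{b}\mathbf{J}_{\omega}\), and the rotation \({}^{e}\mathbf{R}_{b}\) are illustrative assumptions; the abstract only states that position is expressed in the fixed base frame and orientation in the end-effector frame.

% Hedged sketch (requires amsmath); the block arrangement of J_m is an assumption,
% not the paper's published "mixed frame Jacobian matrix".
\begin{align}
  \mathbf{e}(t) &= \mathbf{s}(t) - \mathbf{s}^{*}
    && \text{image-feature error} \\
  \mathbf{v}_{c} &= -\lambda\, \widehat{\mathbf{L}}_{\mathbf{s}}^{+}\, \mathbf{e}
    && \text{classical IBVS velocity command} \\
  \mathbf{J}_{m}(\mathbf{q}) &=
    \begin{bmatrix}
      {}^{b}\mathbf{J}_{v}(\mathbf{q}) \\
      {}^{e}\mathbf{R}_{b}\, {}^{b}\mathbf{J}_{\omega}(\mathbf{q})
    \end{bmatrix}
    && \text{linear rows in the base frame, angular rows rotated to the end-effector frame} \\
  \dot{\mathbf{q}} &= \mathbf{J}_{m}^{+}(\mathbf{q})
    \begin{bmatrix} {}^{b}\mathbf{v} \\ {}^{e}\boldsymbol{\omega} \end{bmatrix}
    && \text{joint velocities realizing the commanded mixed-frame twist}
\end{align}

A conventional single-frame IBVS joint controller would instead use a Jacobian whose linear and angular rows share one frame of reference; the abstract attributes the reported 31% faster convergence, particularly in outstretched configurations and near the base frame, to replacing such a controller with the mixed-frame formulation.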
format Online
Article
Text
id pubmed-8778787
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8778787 2022-01-22 Mix Frame Visual Servo Control Framework for Autonomous Assistive Robotic Arms Arif, Zubair Fu, Yili Sensors (Basel) Article Assistive robotic arms (ARAs) that provide care to the elderly and people with disabilities are a significant part of Human-Robot Interaction (HRI). Presently available ARAs provide non-intuitive interfaces such as joysticks for control and thus lack the autonomy to perform daily activities. This study proposes that, for inducing autonomous behavior in ARAs, the integration of visual sensors is vital, and visual servoing in the direct Cartesian control mode is the preferred method. Generally, ARAs are designed in a configuration where the end-effector’s position is defined in the fixed base frame while its orientation is expressed in the end-effector frame. We denote this configuration as ‘mixed frame robotic arms’. Consequently, conventional visual servo controllers, which operate in a single frame of reference, are incompatible with mixed frame ARAs. Therefore, we propose a mixed-frame visual servo control framework for ARAs. Moreover, we elucidate the task space kinematics of a mixed frame ARA, which led us to the development of a novel “mixed frame Jacobian matrix”. The proposed framework was validated on a mixed frame JACO-2 7 DoF ARA using an adaptive proportional-derivative controller for achieving image-based visual servoing (IBVS), which showed a significant increase of 31% in the convergence rate, outperforming conventional IBVS joint controllers, especially in outstretched arm positions and near the base frame. Our results demonstrate the need for a mixed frame controller for deploying visual servo control on modern ARAs, as it can inherently cater to the robotic arm’s joint limits, singularities, and self-collision problems. MDPI 2022-01-14 /pmc/articles/PMC8778787/ /pubmed/35062597 http://dx.doi.org/10.3390/s22020642 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Arif, Zubair
Fu, Yili
Mix Frame Visual Servo Control Framework for Autonomous Assistive Robotic Arms
title Mix Frame Visual Servo Control Framework for Autonomous Assistive Robotic Arms
title_full Mix Frame Visual Servo Control Framework for Autonomous Assistive Robotic Arms
title_fullStr Mix Frame Visual Servo Control Framework for Autonomous Assistive Robotic Arms
title_full_unstemmed Mix Frame Visual Servo Control Framework for Autonomous Assistive Robotic Arms
title_short Mix Frame Visual Servo Control Framework for Autonomous Assistive Robotic Arms
title_sort mix frame visual servo control framework for autonomous assistive robotic arms
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8778787/
https://www.ncbi.nlm.nih.gov/pubmed/35062597
http://dx.doi.org/10.3390/s22020642
work_keys_str_mv AT arifzubair mixframevisualservocontrolframeworkforautonomousassistiveroboticarms
AT fuyili mixframevisualservocontrolframeworkforautonomousassistiveroboticarms