Sensor Fusion-Based Anthropomorphic Control of a Robotic Arm

The main goal of this research is to develop a highly advanced anthropomorphic control system utilizing multiple sensor technologies to achieve precise control of a robotic arm. Combining Kinect and IMU sensors, together with a data glove, we aim to create a multimodal sensor system for capturing rich information of human upper body movements. Specifically, the four angles of upper limb joints are collected using the Kinect sensor and IMU sensor. In order to improve the accuracy and stability of motion tracking, we use the Kalman filter method to fuse the Kinect and IMU data. In addition, we introduce data glove technology to collect the angle information of the wrist and fingers in seven different directions. The integration and fusion of multiple sensors provides us with full control over the robotic arm, giving it flexibility with 11 degrees of freedom. We successfully achieved a variety of anthropomorphic movements, including shoulder flexion, abduction, rotation, elbow flexion, and fine movements of the wrist and fingers. Most importantly, our experimental results demonstrate that the anthropomorphic control system we developed is highly accurate, real-time, and operable. In summary, the contribution of this study lies in the creation of a multimodal sensor system capable of capturing and precisely controlling human upper limb movements, which provides a solid foundation for the future development of anthropomorphic control technologies. This technology has a wide range of application prospects and can be used for rehabilitation in the medical field, robot collaboration in industrial automation, and immersive experience in virtual reality environments.
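The abstract describes fusing Kinect and IMU joint-angle estimates with a Kalman filter. A minimal one-dimensional sketch of that idea is shown below; the constant-angle process model, the noise variances (`r_kinect`, `r_imu`, `q`), and the function name are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def kalman_fuse(kinect_meas, imu_meas, r_kinect=4.0, r_imu=1.0, q=0.01):
    """Fuse two noisy joint-angle measurement streams (degrees) with a
    scalar Kalman filter, applying one sequential update per sensor at
    each time step. All noise variances here are illustrative."""
    x, p = kinect_meas[0], 1.0            # initial state estimate and variance
    fused = []
    for z_kinect, z_imu in zip(kinect_meas, imu_meas):
        p += q                            # predict: angle assumed roughly constant
        for z, r in ((z_kinect, r_kinect), (z_imu, r_imu)):
            k = p / (p + r)               # Kalman gain for this sensor
            x += k * (z - x)              # correct with this sensor's measurement
            p *= (1.0 - k)
        fused.append(x)
    return np.array(fused)
```

Because each update is weighted by the sensor's variance, the estimate is pulled more strongly toward the lower-noise stream (here the IMU), which is the qualitative behavior the abstract attributes to its Kinect/IMU fusion stage.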


Bibliographic Details

Main Authors: Chen, Furong, Wang, Feilong, Dong, Yanling, Yong, Qi, Yang, Xiaolong, Zheng, Long, Gao, Yi, Su, Hang
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10669049/
https://www.ncbi.nlm.nih.gov/pubmed/38002367
http://dx.doi.org/10.3390/bioengineering10111243
author Chen, Furong
Wang, Feilong
Dong, Yanling
Yong, Qi
Yang, Xiaolong
Zheng, Long
Gao, Yi
Su, Hang
author_sort Chen, Furong
collection PubMed
description The main goal of this research is to develop a highly advanced anthropomorphic control system utilizing multiple sensor technologies to achieve precise control of a robotic arm. Combining Kinect and IMU sensors, together with a data glove, we aim to create a multimodal sensor system for capturing rich information of human upper body movements. Specifically, the four angles of upper limb joints are collected using the Kinect sensor and IMU sensor. In order to improve the accuracy and stability of motion tracking, we use the Kalman filter method to fuse the Kinect and IMU data. In addition, we introduce data glove technology to collect the angle information of the wrist and fingers in seven different directions. The integration and fusion of multiple sensors provides us with full control over the robotic arm, giving it flexibility with 11 degrees of freedom. We successfully achieved a variety of anthropomorphic movements, including shoulder flexion, abduction, rotation, elbow flexion, and fine movements of the wrist and fingers. Most importantly, our experimental results demonstrate that the anthropomorphic control system we developed is highly accurate, real-time, and operable. In summary, the contribution of this study lies in the creation of a multimodal sensor system capable of capturing and precisely controlling human upper limb movements, which provides a solid foundation for the future development of anthropomorphic control technologies. This technology has a wide range of application prospects and can be used for rehabilitation in the medical field, robot collaboration in industrial automation, and immersive experience in virtual reality environments.
format Online
Article
Text
id pubmed-10669049
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10669049 2023-10-24 Sensor Fusion-Based Anthropomorphic Control of a Robotic Arm Chen, Furong; Wang, Feilong; Dong, Yanling; Yong, Qi; Yang, Xiaolong; Zheng, Long; Gao, Yi; Su, Hang. Bioengineering (Basel), Article.
MDPI 2023-10-24 /pmc/articles/PMC10669049/ /pubmed/38002367 http://dx.doi.org/10.3390/bioengineering10111243 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Sensor Fusion-Based Anthropomorphic Control of a Robotic Arm
topic Article