Semi-Autonomous Robotic Arm Reaching With Hybrid Gaze–Brain Machine Interface
Recent developments in the non-muscular human–robot interface (HRI) and shared control strategies have shown potential for controlling an assistive robotic arm by people with no residual movement or muscular activity in the upper limbs. However, most non-muscular HRIs only produce discrete-valued commands...
Main Authors: | Zeng, Hong; Shen, Yitao; Hu, Xuhui; Song, Aiguo; Xu, Baoguo; Li, Huijun; Wang, Yanxin; Wen, Pengcheng |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2020 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6992643/ https://www.ncbi.nlm.nih.gov/pubmed/32038219 http://dx.doi.org/10.3389/fnbot.2019.00111 |
_version_ | 1783492873157607424 |
---|---|
author | Zeng, Hong; Shen, Yitao; Hu, Xuhui; Song, Aiguo; Xu, Baoguo; Li, Huijun; Wang, Yanxin; Wen, Pengcheng |
author_facet | Zeng, Hong; Shen, Yitao; Hu, Xuhui; Song, Aiguo; Xu, Baoguo; Li, Huijun; Wang, Yanxin; Wen, Pengcheng |
author_sort | Zeng, Hong |
collection | PubMed |
description | Recent developments in the non-muscular human–robot interface (HRI) and shared control strategies have shown potential for controlling an assistive robotic arm by people with no residual movement or muscular activity in the upper limbs. However, most non-muscular HRIs only produce discrete-valued commands, resulting in non-intuitive and less effective control of a dexterous assistive robotic arm. Furthermore, shared control strategies in such applications usually switch between user commands and robot autonomy commands, a characteristic that previous user studies have found to yield a reduced sense of agency and frustration for the user. In this study, we first propose an intuitive hybrid HRI, easy to learn and use, that combines a brain–machine interface (BMI) with a gaze-tracking interface. In the proposed hybrid gaze-BMI, continuous modulation of the movement speed via the motor intention occurs seamlessly and simultaneously with unconstrained control of the movement direction via the gaze signals. We then propose a shared control paradigm that always blends user input with the autonomy, dynamically regulating their combination. The proposed hybrid gaze-BMI and shared control paradigm were validated in a robotic arm reaching task performed by healthy subjects. All users were able to employ the hybrid gaze-BMI to move the end effector sequentially toward the target across the horizontal plane while avoiding collisions with obstacles. The shared control paradigm maintained as much volitional control as possible while providing assistance for the most difficult parts of the task. The presented semi-autonomous robotic system yielded continuous, smooth, and collision-free motion trajectories for the end effector approaching the target. Compared to a system without assistance from robot autonomy, it significantly reduced the failure rate as well as the time and effort spent by the user to complete the tasks. (A sketch of this control-blending scheme follows the record below.) |
format | Online Article Text |
id | pubmed-6992643 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-6992643 2020-02-07 Semi-Autonomous Robotic Arm Reaching With Hybrid Gaze–Brain Machine Interface Zeng, Hong; Shen, Yitao; Hu, Xuhui; Song, Aiguo; Xu, Baoguo; Li, Huijun; Wang, Yanxin; Wen, Pengcheng Front Neurorobot Neuroscience Recent developments in the non-muscular human–robot interface (HRI) and shared control strategies have shown potential for controlling an assistive robotic arm by people with no residual movement or muscular activity in the upper limbs. However, most non-muscular HRIs only produce discrete-valued commands, resulting in non-intuitive and less effective control of a dexterous assistive robotic arm. Furthermore, shared control strategies in such applications usually switch between user commands and robot autonomy commands, a characteristic that previous user studies have found to yield a reduced sense of agency and frustration for the user. In this study, we first propose an intuitive hybrid HRI, easy to learn and use, that combines a brain–machine interface (BMI) with a gaze-tracking interface. In the proposed hybrid gaze-BMI, continuous modulation of the movement speed via the motor intention occurs seamlessly and simultaneously with unconstrained control of the movement direction via the gaze signals. We then propose a shared control paradigm that always blends user input with the autonomy, dynamically regulating their combination. The proposed hybrid gaze-BMI and shared control paradigm were validated in a robotic arm reaching task performed by healthy subjects. All users were able to employ the hybrid gaze-BMI to move the end effector sequentially toward the target across the horizontal plane while avoiding collisions with obstacles. The shared control paradigm maintained as much volitional control as possible while providing assistance for the most difficult parts of the task. The presented semi-autonomous robotic system yielded continuous, smooth, and collision-free motion trajectories for the end effector approaching the target. Compared to a system without assistance from robot autonomy, it significantly reduced the failure rate as well as the time and effort spent by the user to complete the tasks. Frontiers Media S.A. 2020-01-24 /pmc/articles/PMC6992643/ /pubmed/32038219 http://dx.doi.org/10.3389/fnbot.2019.00111 Text en Copyright © 2020 Zeng, Shen, Hu, Song, Xu, Li, Wang and Wen. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Zeng, Hong; Shen, Yitao; Hu, Xuhui; Song, Aiguo; Xu, Baoguo; Li, Huijun; Wang, Yanxin; Wen, Pengcheng Semi-Autonomous Robotic Arm Reaching With Hybrid Gaze–Brain Machine Interface |
title | Semi-Autonomous Robotic Arm Reaching With Hybrid Gaze–Brain Machine Interface |
title_full | Semi-Autonomous Robotic Arm Reaching With Hybrid Gaze–Brain Machine Interface |
title_fullStr | Semi-Autonomous Robotic Arm Reaching With Hybrid Gaze–Brain Machine Interface |
title_full_unstemmed | Semi-Autonomous Robotic Arm Reaching With Hybrid Gaze–Brain Machine Interface |
title_short | Semi-Autonomous Robotic Arm Reaching With Hybrid Gaze–Brain Machine Interface |
title_sort | semi-autonomous robotic arm reaching with hybrid gaze–brain machine interface |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6992643/ https://www.ncbi.nlm.nih.gov/pubmed/32038219 http://dx.doi.org/10.3389/fnbot.2019.00111 |
work_keys_str_mv | AT zenghong semiautonomousroboticarmreachingwithhybridgazebrainmachineinterface AT shenyitao semiautonomousroboticarmreachingwithhybridgazebrainmachineinterface AT huxuhui semiautonomousroboticarmreachingwithhybridgazebrainmachineinterface AT songaiguo semiautonomousroboticarmreachingwithhybridgazebrainmachineinterface AT xubaoguo semiautonomousroboticarmreachingwithhybridgazebrainmachineinterface AT lihuijun semiautonomousroboticarmreachingwithhybridgazebrainmachineinterface AT wangyanxin semiautonomousroboticarmreachingwithhybridgazebrainmachineinterface AT wenpengcheng semiautonomousroboticarmreachingwithhybridgazebrainmachineinterface |
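The abstract describes two mechanisms worth making concrete: a hybrid command in which gaze sets the movement direction while the BMI-decoded motor intention continuously scales the speed, and an always-on shared controller that blends this user command with an autonomous obstacle-avoidance command, with the blend regulated dynamically. The sketch below is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the potential-field autonomy, the normalized BMI output, and the distance-based blending weight `alpha` are all assumptions for illustration.

```python
import numpy as np

# Illustrative sketch only: the abstract publishes no equations, so the
# potential-field autonomy and the distance-based blending weight below are
# assumptions, not the authors' method. Positions are 2D (horizontal plane).

V_MAX = 0.05        # assumed speed cap for the end effector (m/s)
INFLUENCE = 0.15    # assumed obstacle influence radius (m)

def user_command(gaze_point, effector_pos, bmi_intensity):
    """Hybrid gaze-BMI command: gaze fixes the direction, while the
    BMI-decoded motor intention (assumed in [0, 1]) scales the speed."""
    offset = np.asarray(gaze_point, float) - np.asarray(effector_pos, float)
    dist = np.linalg.norm(offset)
    if dist < 1e-6:
        return np.zeros(2)
    speed = np.clip(bmi_intensity, 0.0, 1.0) * V_MAX
    return speed * offset / dist

def autonomy_command(effector_pos, obstacles):
    """Repulsive potential-field command pushing away from nearby obstacles."""
    pos = np.asarray(effector_pos, float)
    v = np.zeros(2)
    for obs in obstacles:
        away = pos - np.asarray(obs, float)
        d = np.linalg.norm(away)
        if 1e-6 < d < INFLUENCE:
            v += (1.0 / d - 1.0 / INFLUENCE) * away / d
    n = np.linalg.norm(v)
    return V_MAX * v / n if n > 1e-6 else v

def blended_command(effector_pos, gaze_point, bmi_intensity, obstacles):
    """Always-on blending: both commands are combined at every step; the
    autonomy weight alpha grows only as the end effector nears an obstacle."""
    u_user = user_command(gaze_point, effector_pos, bmi_intensity)
    u_auto = autonomy_command(effector_pos, obstacles)
    pos = np.asarray(effector_pos, float)
    d_min = min((np.linalg.norm(pos - np.asarray(o, float)) for o in obstacles),
                default=np.inf)
    alpha = float(np.clip(1.0 - d_min / INFLUENCE, 0.0, 1.0))
    return (1.0 - alpha) * u_user + alpha * u_auto
```

With this assumed weighting, `alpha` is 0 in open space, so the user's gaze-BMI command passes through unchanged, and it approaches 1 only near obstacles. That is one simple way to realize the abstract's claim of preserving volitional control while providing assistance in the most difficult parts of the task.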