
Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping

BACKGROUND: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object. METHODS: Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Using shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps. RESULTS: Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92 % of trials, demonstrating the potential for users to accurately execute their intention while using shared control. CONCLUSIONS: Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users. TRIAL REGISTRATION: NCT01364480 and NCT01894802. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12984-016-0134-9) contains supplementary material, which is available to authorized users.

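The shared-control method summarized above blends BMI-derived movement commands with an autonomous grasp command as the hand approaches an identified object. A minimal sketch of one such blending rule follows; the linear arbitration, the engage_radius and max_assist parameters, and all function names are illustrative assumptions, not the control law reported in the paper:

import numpy as np

def blend_commands(user_vel, auto_vel, hand_pos, target_pos,
                   engage_radius=0.15, max_assist=0.7):
    # Hypothetical arbitration: assistance ramps up linearly as the hand
    # nears the identified object, and the user always retains at least
    # (1 - max_assist) of the control authority.
    user_vel, auto_vel = np.asarray(user_vel), np.asarray(auto_vel)
    dist = np.linalg.norm(np.asarray(target_pos) - np.asarray(hand_pos))
    alpha = 0.0 if dist >= engage_radius else max_assist * (1.0 - dist / engage_radius)
    return (1.0 - alpha) * user_vel + alpha * auto_vel

# Example: the hand is 5 cm from the object, so the autonomous approach
# command pulls the blended velocity toward the stable grasp position.
print(blend_commands(user_vel=[0.10, 0.02, 0.00],
                     auto_vel=[0.08, -0.01, -0.03],
                     hand_pos=[0.40, 0.10, 0.20],
                     target_pos=[0.45, 0.10, 0.20]))

Far from any identified object, alpha is zero and the arm follows the BMI command alone, which is one way a scheme like this could preserve the generalizability the authors emphasize.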

Bibliographic Details
Main Authors: Downey, John E., Weiss, Jeffrey M., Muelling, Katharina, Venkatraman, Arun, Valois, Jean-Sebastien, Hebert, Martial, Bagnell, J. Andrew, Schwartz, Andrew B., Collinger, Jennifer L.
Format: Online Article Text
Language: English
Published: BioMed Central 2016
Subjects: Research
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4797113/
https://www.ncbi.nlm.nih.gov/pubmed/26987662
http://dx.doi.org/10.1186/s12984-016-0134-9
author Downey, John E.
Weiss, Jeffrey M.
Muelling, Katharina
Venkatraman, Arun
Valois, Jean-Sebastien
Hebert, Martial
Bagnell, J. Andrew
Schwartz, Andrew B.
Collinger, Jennifer L.
collection PubMed
description BACKGROUND: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object. METHODS: Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Using shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps. RESULTS: Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92 % of trials, demonstrating the potential for users to accurately execute their intention while using shared control. CONCLUSIONS: Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users. TRIAL REGISTRATION: NCT01364480 and NCT01894802. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12984-016-0134-9) contains supplementary material, which is available to authorized users.
format Online
Article
Text
id pubmed-4797113
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher BioMed Central
record_format MEDLINE/PubMed
journal J Neuroeng Rehabil
section Research
published 2016-03-18
rights © Downey et al. 2016. Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
title Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4797113/
https://www.ncbi.nlm.nih.gov/pubmed/26987662
http://dx.doi.org/10.1186/s12984-016-0134-9