Vision-aided brain–machine interface training system for robotic arm control and clinical application on two patients with cervical spinal cord injury
Main Authors: Kim, Yoon Jae; Nam, Hyung Seok; Lee, Woo Hyung; Seo, Han Gil; Leigh, Ja-Ho; Oh, Byung-Mo; Bang, Moon Suk; Kim, Sungwan
Format: Online Article Text
Language: English
Published: BioMed Central, 2019
Subjects: Research
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6371594/ https://www.ncbi.nlm.nih.gov/pubmed/30744661 http://dx.doi.org/10.1186/s12938-019-0633-6
_version_ | 1783394586696089600 |
author | Kim, Yoon Jae Nam, Hyung Seok Lee, Woo Hyung Seo, Han Gil Leigh, Ja-Ho Oh, Byung-Mo Bang, Moon Suk Kim, Sungwan |
author_facet | Kim, Yoon Jae Nam, Hyung Seok Lee, Woo Hyung Seo, Han Gil Leigh, Ja-Ho Oh, Byung-Mo Bang, Moon Suk Kim, Sungwan |
author_sort | Kim, Yoon Jae |
collection | PubMed |
description | BACKGROUND: While spontaneous robotic arm control using motor imagery has been reported, most previous successful cases have used invasive approaches, which offer advantages in spatial resolution. Nevertheless, many researchers continue to investigate methods for robotic arm control with noninvasive neural signals. Most noninvasive robotic arm control utilizes the P300, steady-state visually evoked potentials, the N2pc, and differentiation of mental tasks. Although these approaches have demonstrated adequate accuracy, they are limited in time efficiency and intuitiveness for the user, and most require visual stimulation. Ultimately, velocity vector construction from electroencephalography activated by motion-related motor imagery can be considered as an alternative. In this study, a vision-aided brain–machine interface training system for robotic arm control is proposed and developed. METHODS: The proposed system uses a Microsoft Kinect to detect possible target objects and estimate their 3D positions. The velocity vector predicted for the robotic arm input is compensated using an artificial potential field so that the arm follows the intended target among the possible candidates. Two participants with cervical spinal cord injury trained with the system to explore its possible effects. RESULTS: In a scenario with four possible targets, the proposed system significantly reduced the distance error to the intended target compared with the unintended ones (p < 0.0001). Functional magnetic resonance imaging after five sessions of observation-based training with the developed system showed brain activation patterns with a tendency to focus on the ipsilateral primary motor and sensory cortices, the posterior parietal cortex, and the contralateral cerebellum. However, shared control with a blending parameter α of less than 1 was not successful, and the success rate for touching the instructed target was below the chance level (50%). CONCLUSIONS: The pilot clinical study utilizing the training system suggested potential beneficial effects in characterizing the brain activation patterns. |
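The METHODS description above mentions that the decoded velocity vector is compensated with an artificial potential toward candidate targets detected by the Kinect, and the RESULTS refer to a shared-control blending parameter α. Since the record contains no implementation details, the Python sketch below only illustrates the general idea under stated assumptions: the attractive-potential form, the distance-based weighting, the gains, and all function names are hypothetical and are not the authors' code.

```python
import numpy as np

def attractive_field_velocity(effector_pos, target_positions, gain=1.0):
    """Hypothetical attractive artificial-potential velocity toward candidate targets.

    effector_pos     : (3,) current end-effector position
    target_positions : (N, 3) estimated 3D target positions (e.g., from a Kinect)
    gain             : assumed attraction gain (not specified in the abstract)
    """
    diffs = target_positions - effector_pos                # vectors toward each target
    dists = np.linalg.norm(diffs, axis=1, keepdims=True)   # distance to each target
    weights = 1.0 / np.maximum(dists, 1e-6) ** 2           # nearer targets pull harder
    field = (weights * diffs).sum(axis=0)
    norm = np.linalg.norm(field)
    return gain * field / norm if norm > 1e-9 else np.zeros(3)

def blended_command(v_decoded, effector_pos, target_positions, alpha=1.0):
    """Shared-control blend of the EEG-decoded velocity and the potential-field velocity.

    alpha = 1.0 gives pure decoded control; alpha < 1.0 mixes in vision-derived
    assistance (the configuration the abstract reports as unsuccessful in the pilot).
    """
    v_assist = attractive_field_velocity(effector_pos, target_positions)
    return alpha * np.asarray(v_decoded) + (1.0 - alpha) * v_assist

# Example: four candidate targets, matching the reported four-target scenario
targets = np.array([[0.3, 0.2, 0.1], [0.3, -0.2, 0.1],
                    [0.5, 0.2, 0.1], [0.5, -0.2, 0.1]])
v_cmd = blended_command(v_decoded=[0.05, 0.0, 0.02],
                        effector_pos=np.zeros(3),
                        target_positions=targets, alpha=0.7)
```

Note that the abstract states the compensation follows "an intended one among the possible targets" without saying how the intended target is inferred, so the weighting-by-distance heuristic above is purely an assumption for illustration.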
format | Online Article Text |
id | pubmed-6371594 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-6371594 2019-02-25 Vision-aided brain–machine interface training system for robotic arm control and clinical application on two patients with cervical spinal cord injury Kim, Yoon Jae Nam, Hyung Seok Lee, Woo Hyung Seo, Han Gil Leigh, Ja-Ho Oh, Byung-Mo Bang, Moon Suk Kim, Sungwan Biomed Eng Online Research BACKGROUND: While spontaneous robotic arm control using motor imagery has been reported, most previous successful cases have used invasive approaches, which offer advantages in spatial resolution. Nevertheless, many researchers continue to investigate methods for robotic arm control with noninvasive neural signals. Most noninvasive robotic arm control utilizes the P300, steady-state visually evoked potentials, the N2pc, and differentiation of mental tasks. Although these approaches have demonstrated adequate accuracy, they are limited in time efficiency and intuitiveness for the user, and most require visual stimulation. Ultimately, velocity vector construction from electroencephalography activated by motion-related motor imagery can be considered as an alternative. In this study, a vision-aided brain–machine interface training system for robotic arm control is proposed and developed. METHODS: The proposed system uses a Microsoft Kinect to detect possible target objects and estimate their 3D positions. The velocity vector predicted for the robotic arm input is compensated using an artificial potential field so that the arm follows the intended target among the possible candidates. Two participants with cervical spinal cord injury trained with the system to explore its possible effects. RESULTS: In a scenario with four possible targets, the proposed system significantly reduced the distance error to the intended target compared with the unintended ones (p < 0.0001). Functional magnetic resonance imaging after five sessions of observation-based training with the developed system showed brain activation patterns with a tendency to focus on the ipsilateral primary motor and sensory cortices, the posterior parietal cortex, and the contralateral cerebellum. However, shared control with a blending parameter α of less than 1 was not successful, and the success rate for touching the instructed target was below the chance level (50%). CONCLUSIONS: The pilot clinical study utilizing the training system suggested potential beneficial effects in characterizing the brain activation patterns. BioMed Central 2019-02-11 /pmc/articles/PMC6371594/ /pubmed/30744661 http://dx.doi.org/10.1186/s12938-019-0633-6 Text en © The Author(s) 2019 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated. |
spellingShingle | Research Kim, Yoon Jae Nam, Hyung Seok Lee, Woo Hyung Seo, Han Gil Leigh, Ja-Ho Oh, Byung-Mo Bang, Moon Suk Kim, Sungwan Vision-aided brain–machine interface training system for robotic arm control and clinical application on two patients with cervical spinal cord injury |
title | Vision-aided brain–machine interface training system for robotic arm control and clinical application on two patients with cervical spinal cord injury |
title_full | Vision-aided brain–machine interface training system for robotic arm control and clinical application on two patients with cervical spinal cord injury |
title_fullStr | Vision-aided brain–machine interface training system for robotic arm control and clinical application on two patients with cervical spinal cord injury |
title_full_unstemmed | Vision-aided brain–machine interface training system for robotic arm control and clinical application on two patients with cervical spinal cord injury |
title_short | Vision-aided brain–machine interface training system for robotic arm control and clinical application on two patients with cervical spinal cord injury |
title_sort | vision-aided brain–machine interface training system for robotic arm control and clinical application on two patients with cervical spinal cord injury |
topic | Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6371594/ https://www.ncbi.nlm.nih.gov/pubmed/30744661 http://dx.doi.org/10.1186/s12938-019-0633-6 |
work_keys_str_mv | AT kimyoonjae visionaidedbrainmachineinterfacetrainingsystemforroboticarmcontrolandclinicalapplicationontwopatientswithcervicalspinalcordinjury AT namhyungseok visionaidedbrainmachineinterfacetrainingsystemforroboticarmcontrolandclinicalapplicationontwopatientswithcervicalspinalcordinjury AT leewoohyung visionaidedbrainmachineinterfacetrainingsystemforroboticarmcontrolandclinicalapplicationontwopatientswithcervicalspinalcordinjury AT seohangil visionaidedbrainmachineinterfacetrainingsystemforroboticarmcontrolandclinicalapplicationontwopatientswithcervicalspinalcordinjury AT leighjaho visionaidedbrainmachineinterfacetrainingsystemforroboticarmcontrolandclinicalapplicationontwopatientswithcervicalspinalcordinjury AT ohbyungmo visionaidedbrainmachineinterfacetrainingsystemforroboticarmcontrolandclinicalapplicationontwopatientswithcervicalspinalcordinjury AT bangmoonsuk visionaidedbrainmachineinterfacetrainingsystemforroboticarmcontrolandclinicalapplicationontwopatientswithcervicalspinalcordinjury AT kimsungwan visionaidedbrainmachineinterfacetrainingsystemforroboticarmcontrolandclinicalapplicationontwopatientswithcervicalspinalcordinjury |