
Virtual and Actual Humanoid Robot Control with Four-Class Motor-Imagery-Based Optical Brain-Computer Interface

Motor-imagery tasks are a popular input method for controlling brain-computer interfaces (BCIs), partially due to their similarities to naturally produced motor signals. The use of functional near-infrared spectroscopy (fNIRS) in BCIs is still emerging and has shown potential as a supplement or replacement for electroencephalography. However, studies often use only two or three motor-imagery tasks, limiting the number of available commands. In this work, we present the results of the first four-class motor-imagery-based online fNIRS-BCI for robot control. Thirteen participants utilized upper- and lower-limb motor-imagery tasks (left hand, right hand, left foot, and right foot) that were mapped to four high-level commands (turn left, turn right, move forward, and move backward) to control the navigation of a simulated or real robot. A significant improvement in classification accuracy was found between the virtual-robot-based BCI (control of a virtual robot) and the physical-robot BCI (control of the DARwIn-OP humanoid robot). Differences were also found in the oxygenated hemoglobin activation patterns of the four tasks between the first and second BCI. These results corroborate previous findings that motor imagery can be improved with feedback and imply that a four-class motor-imagery-based fNIRS-BCI could be feasible with sufficient subject training.

Bibliographic Details
Main Authors: Batula, Alyssa M., Kim, Youngmoo E., Ayaz, Hasan
Format: Online Article Text
Language: English
Published: Hindawi, 2017
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5539938/
https://www.ncbi.nlm.nih.gov/pubmed/28804712
http://dx.doi.org/10.1155/2017/1463512
Collection: PubMed
Record ID: pubmed-5539938
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Biomed Res Int
Article Type: Research Article
Publication Date: 2017-07-18
Copyright: © 2017 Alyssa M. Batula et al.
License: Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/). This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.