An EEG/EMG/EOG-Based Multimodal Human-Machine Interface to Real-Time Control of a Soft Robot Hand

Bibliographic Details
Main Authors: Zhang, Jinhua, Wang, Baozeng, Zhang, Cheng, Xiao, Yanqing, Wang, Michael Yu
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2019
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6449448/
https://www.ncbi.nlm.nih.gov/pubmed/30983986
http://dx.doi.org/10.3389/fnbot.2019.00007
author Zhang, Jinhua
Wang, Baozeng
Zhang, Cheng
Xiao, Yanqing
Wang, Michael Yu
collection PubMed
description Brain-computer interface (BCI) technology shows potential for application in motor rehabilitation therapies that use neural plasticity to restore motor function and improve the quality of life of stroke survivors. However, it is often difficult for BCI systems to provide the variety of control commands necessary for natural, multi-task, real-time control of a soft robot. In this study, a novel multimodal human-machine interface system (mHMI) is developed using combinations of electrooculography (EOG), electroencephalography (EEG), and electromyography (EMG) to generate a large set of control instructions. Moreover, we also explore subject acceptance of an affordable wearable soft robot for performing basic hand actions during robot-assisted movement. Six healthy subjects separately perform left- and right-hand motor imagery, looking-left and looking-right eye movements, and different hand gestures in different modes to control a soft robot across a variety of actions. The results indicate that the number of mHMI control instructions is significantly greater than achievable with any individual modality. Furthermore, the mHMI achieves an average classification accuracy of 93.83% with an average information transfer rate of 47.41 bits/min, equivalent to a control speed of 17 actions per minute. This study is expected to yield a more user-friendly mHMI for real-time control of a soft robot, helping healthy or disabled persons perform basic hand movements in a friendly and convenient way.
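The reported throughput can be sanity-checked against the standard Wolpaw information transfer rate (ITR) formula commonly used in BCI studies. The minimal Python sketch below is an illustration, not the authors' method: the abstract does not state the command count or per-selection timing, so the 10 command classes and 17 selections per minute used here are assumptions, chosen only to show that the reported 93.83% accuracy and 47.41 bits/min are mutually consistent.

import math

def wolpaw_itr_bits_per_min(n_classes, accuracy, trials_per_min):
    # Bits conveyed per selection under the Wolpaw model:
    # log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    p = accuracy
    bits = math.log2(n_classes)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p)
        bits += (1.0 - p) * math.log2((1.0 - p) / (n_classes - 1))
    return bits * trials_per_min

# Assumed values: 10 distinct commands at 17 selections/min with the
# paper's reported 93.83% average accuracy -> about 47.5 bits/min,
# in line with the reported 47.41 bits/min.
print(round(wolpaw_itr_bits_per_min(10, 0.9383, 17), 2))

Under these assumed values the formula yields roughly 47.5 bits/min at 17 selections per minute, which matches the abstract's pairing of ITR and control speed.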
format Online
Article
Text
id pubmed-6449448
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-6449448 2019-04-12 Front Neurorobot Neuroscience Frontiers Media S.A. 2019-03-29 /pmc/articles/PMC6449448/ /pubmed/30983986 http://dx.doi.org/10.3389/fnbot.2019.00007 Text en Copyright © 2019 Zhang, Wang, Zhang, Xiao and Wang. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title An EEG/EMG/EOG-Based Multimodal Human-Machine Interface to Real-Time Control of a Soft Robot Hand
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6449448/
https://www.ncbi.nlm.nih.gov/pubmed/30983986
http://dx.doi.org/10.3389/fnbot.2019.00007