Towards BCI-actuated smart wheelchair system


Bibliographic Details
Main Authors: Tang, Jingsheng, Liu, Yadong, Hu, Dewen, Zhou, ZongTan
Format: Online Article Text
Language: English
Published: BioMed Central 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6102906/
https://www.ncbi.nlm.nih.gov/pubmed/30126416
http://dx.doi.org/10.1186/s12938-018-0545-x
_version_ 1783349265615028224
author Tang, Jingsheng
Liu, Yadong
Hu, Dewen
Zhou, ZongTan
author_facet Tang, Jingsheng
Liu, Yadong
Hu, Dewen
Zhou, ZongTan
author_sort Tang, Jingsheng
collection PubMed
description BACKGROUND: Electroencephalogram-based brain–computer interfaces (BCIs) are a novel human–machine interaction technology that allows people to communicate and interact with the external world without relying on their peripheral nerves and muscles. Among BCI systems, brain-actuated wheelchairs are promising for the rehabilitation of severely motor-disabled individuals who cannot control a wheelchair through conventional interfaces. Previous studies have produced easy-to-use brain-actuated wheelchairs that let users navigate with simple commands, but these systems rely on offline calibration of the environment. Other systems require no prior knowledge of the environment, but controlling them is time consuming. In this paper, we propose an improved mobile platform equipped with an omnidirectional wheelchair, a lightweight robotic arm, a target recognition module and an auto-control module. Based on the you only look once (YOLO) algorithm, the system recognizes and locates targets in the environment in real time, and the user confirms one target through a P300-based BCI. An expert system then plans an appropriate solution for the selected target; for example, the planned solution for a door is to open it and pass through, and the auto-control system jointly controls the wheelchair and robotic arm to complete the operation. During task execution, the target is also followed using an image tracking technique. The result is an easy-to-use system that provides accurate services tailored to user requirements and accommodates different environments. RESULTS: To validate and evaluate the system, an experiment simulating daily use was performed. The tasks included driving the system closer to a walking person and holding a conversation; entering another room through a door; and picking up a bottle of water from a desk and drinking from it. Three patients (cerebral infarction, spinal injury and stroke) and four healthy subjects participated in the test, and all of them completed the tasks. CONCLUSION: This article presents a brain-actuated smart wheelchair system that provides efficient, considerate services for its users. In tests with three patients and four healthy subjects, the system worked reliably and efficiently; users needed to issue only a few commands to receive the intended services. This system is significant for accelerating the application of BCIs in practical environments, especially for patients who will use a BCI for rehabilitation applications.
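The description above outlines a detect-confirm-plan-execute pipeline: YOLO-based detection of candidate targets, P300-based confirmation of one target by the user, expert-system planning of an action sequence, and joint wheelchair/arm control with image tracking during execution. The sketch below illustrates only that control flow; it is a minimal outline under stated assumptions, and every name in it (functions, classes, the plan table) is hypothetical rather than taken from the authors' implementation.

```python
# Hypothetical sketch (not the authors' code) of the detect -> confirm ->
# plan -> execute loop described in the abstract. All names, classes and
# plan entries below are illustrative assumptions.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Target:
    label: str                       # object class from the detector, e.g. "door"
    bbox: Tuple[int, int, int, int]  # (x, y, w, h) in image coordinates


# Expert-system-style lookup: each recognizable target class maps to a
# predefined action sequence for the wheelchair and the robotic arm,
# mirroring the "door -> open it and pass through" example in the abstract.
ACTION_PLANS = {
    "door":   ["approach", "open_door_with_arm", "drive_through"],
    "person": ["approach", "stop_at_conversation_distance"],
    "bottle": ["approach", "grasp_with_arm", "bring_to_user"],
}


def detect_targets(frame) -> List[Target]:
    """Placeholder for the YOLO-based detector: return labelled targets."""
    raise NotImplementedError("replace with a real object detector")


def confirm_target_with_p300(targets: List[Target]) -> Target:
    """Placeholder for the P300 BCI stage: the user selects one target."""
    raise NotImplementedError("replace with the BCI selection interface")


def plan_for(target: Target) -> List[str]:
    """Expert-system step: choose an action sequence for the chosen target."""
    return ACTION_PLANS.get(target.label, ["approach"])


def execute(plan: List[str], target: Target) -> None:
    """Auto-control step: run each action while the target is tracked."""
    for action in plan:
        # A real system would re-localize `target` with an image tracker here
        # and send commands to the omnidirectional base and the robotic arm.
        print(f"executing {action} on {target.label}")
```

The table-driven plan lookup is only meant to show how a confirmed target can be mapped to a multi-step wheelchair/arm behaviour; in a real system the placeholders would be replaced by the object detector, the P300 selection interface and the motion controllers.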
format Online
Article
Text
id pubmed-6102906
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-6102906 2018-08-30 Towards BCI-actuated smart wheelchair system Tang, Jingsheng Liu, Yadong Hu, Dewen Zhou, ZongTan Biomed Eng Online Research BACKGROUND: Electroencephalogram-based brain–computer interfaces (BCIs) are a novel human–machine interaction technology that allows people to communicate and interact with the external world without relying on their peripheral nerves and muscles. Among BCI systems, brain-actuated wheelchairs are promising for the rehabilitation of severely motor-disabled individuals who cannot control a wheelchair through conventional interfaces. Previous studies have produced easy-to-use brain-actuated wheelchairs that let users navigate with simple commands, but these systems rely on offline calibration of the environment. Other systems require no prior knowledge of the environment, but controlling them is time consuming. In this paper, we propose an improved mobile platform equipped with an omnidirectional wheelchair, a lightweight robotic arm, a target recognition module and an auto-control module. Based on the you only look once (YOLO) algorithm, the system recognizes and locates targets in the environment in real time, and the user confirms one target through a P300-based BCI. An expert system then plans an appropriate solution for the selected target; for example, the planned solution for a door is to open it and pass through, and the auto-control system jointly controls the wheelchair and robotic arm to complete the operation. During task execution, the target is also followed using an image tracking technique. The result is an easy-to-use system that provides accurate services tailored to user requirements and accommodates different environments. RESULTS: To validate and evaluate the system, an experiment simulating daily use was performed. The tasks included driving the system closer to a walking person and holding a conversation; entering another room through a door; and picking up a bottle of water from a desk and drinking from it. Three patients (cerebral infarction, spinal injury and stroke) and four healthy subjects participated in the test, and all of them completed the tasks. CONCLUSION: This article presents a brain-actuated smart wheelchair system that provides efficient, considerate services for its users. In tests with three patients and four healthy subjects, the system worked reliably and efficiently; users needed to issue only a few commands to receive the intended services. This system is significant for accelerating the application of BCIs in practical environments, especially for patients who will use a BCI for rehabilitation applications. BioMed Central 2018-08-20 /pmc/articles/PMC6102906/ /pubmed/30126416 http://dx.doi.org/10.1186/s12938-018-0545-x Text en © The Author(s) 2018 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
spellingShingle Research
Tang, Jingsheng
Liu, Yadong
Hu, Dewen
Zhou, ZongTan
Towards BCI-actuated smart wheelchair system
title Towards BCI-actuated smart wheelchair system
title_full Towards BCI-actuated smart wheelchair system
title_fullStr Towards BCI-actuated smart wheelchair system
title_full_unstemmed Towards BCI-actuated smart wheelchair system
title_short Towards BCI-actuated smart wheelchair system
title_sort towards bci-actuated smart wheelchair system
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6102906/
https://www.ncbi.nlm.nih.gov/pubmed/30126416
http://dx.doi.org/10.1186/s12938-018-0545-x
work_keys_str_mv AT tangjingsheng towardsbciactuatedsmartwheelchairsystem
AT liuyadong towardsbciactuatedsmartwheelchairsystem
AT hudewen towardsbciactuatedsmartwheelchairsystem
AT zhouzongtan towardsbciactuatedsmartwheelchairsystem