A novel brain-controlled wheelchair combined with computer vision and augmented reality


Bibliographic Details
Main Authors: Liu, Kaixuan, Yu, Yang, Liu, Yadong, Tang, Jingsheng, Liang, Xinbin, Chu, Xingxing, Zhou, Zongtan
Format: Online Article Text
Language: English
Published: BioMed Central 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9327337/
https://www.ncbi.nlm.nih.gov/pubmed/35883092
http://dx.doi.org/10.1186/s12938-022-01020-8
_version_ 1784757486822096896
author Liu, Kaixuan
Yu, Yang
Liu, Yadong
Tang, Jingsheng
Liang, Xinbin
Chu, Xingxing
Zhou, Zongtan
author_sort Liu, Kaixuan
collection PubMed
description BACKGROUND: Brain-controlled wheelchairs (BCWs) are important applications of brain–computer interfaces (BCIs). Currently, most BCWs are semiautomatic. When users want to reach a target of interest in their immediate environment, this semiautomatic interaction strategy is slow.
METHODS: To this end, we combined computer vision (CV) and augmented reality (AR) with a BCW and proposed the CVAR-BCW: a BCW with a novel automatic interaction strategy. The proposed CVAR-BCW uses a translucent head-mounted display (HMD) as the user interface, uses CV to automatically detect environments, and shows the detected targets through AR technology. Once a user has chosen a target, the CVAR-BCW can automatically navigate to it. For a few scenarios, the semiautomatic strategy might be useful. We integrated a semiautomatic interaction framework into the CVAR-BCW. The user can switch between the automatic and semiautomatic strategies.
RESULTS: We recruited 20 non-disabled subjects for this study and used the accuracy, information transfer rate (ITR), and average time required for the CVAR-BCW to reach each designated target as performance metrics. The experimental results showed that our CVAR-BCW performed well in indoor environments: the average accuracies across all subjects were 83.6% (automatic) and 84.1% (semiautomatic), the average ITRs were 8.2 bits/min (automatic) and 8.3 bits/min (semiautomatic), the average times required to reach a target were 42.4 s (automatic) and 93.4 s (semiautomatic), and the average workloads and degrees of fatigue for the two strategies were both approximately 20.
CONCLUSIONS: Our CVAR-BCW provides a user-centric interaction approach and a good framework for integrating more advanced artificial intelligence technologies, which may be useful in the field of disability assistance.
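The abstract reports performance as an information transfer rate (ITR) in bits/min. As a sketch of how such a figure is commonly obtained in BCI studies, the snippet below uses the standard Wolpaw formula; note that this record does not state how the paper itself computed ITR, how many selectable targets N there were, or the selection rate, so the function and the example values are illustrative assumptions only.

```python
import math

def wolpaw_itr(n_targets: int, accuracy: float, selections_per_min: float) -> float:
    """ITR in bits/min via the Wolpaw formula, a common BCI throughput metric.

    bits/selection = log2(N) + P*log2(P) + (1 - P)*log2((1 - P)/(N - 1))

    n_targets and selections_per_min are hypothetical here; the record
    does not report them for this study.
    """
    p = accuracy
    bits = math.log2(n_targets)
    if 0.0 < p < 1.0:
        # Penalty term for imperfect selections, errors assumed uniform
        # over the N-1 wrong targets.
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_targets - 1))
    # p == 1.0 contributes exactly log2(N) bits per selection.
    return bits * selections_per_min

# 83.6% is the reported average accuracy for the automatic strategy;
# N = 4 targets and 8 selections/min are illustrative assumptions.
print(round(wolpaw_itr(n_targets=4, accuracy=0.836, selections_per_min=8.0), 2))
```

With different assumed values of N and the selection rate, the same accuracy yields different bits/min, which is why ITR is only comparable across studies when those parameters are stated.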
format Online
Article
Text
id pubmed-9327337
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-9327337 2022-07-28 A novel brain-controlled wheelchair combined with computer vision and augmented reality Liu, Kaixuan Yu, Yang Liu, Yadong Tang, Jingsheng Liang, Xinbin Chu, Xingxing Zhou, Zongtan Biomed Eng Online Research BioMed Central 2022-07-26 /pmc/articles/PMC9327337/ /pubmed/35883092 http://dx.doi.org/10.1186/s12938-022-01020-8 Text en © The Author(s) 2022. Open Access under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title A novel brain-controlled wheelchair combined with computer vision and augmented reality
title_sort novel brain-controlled wheelchair combined with computer vision and augmented reality
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9327337/
https://www.ncbi.nlm.nih.gov/pubmed/35883092
http://dx.doi.org/10.1186/s12938-022-01020-8