Real-Time Navigation in Google Street View® Using a Motor Imagery-Based BCI
Navigation in virtual worlds is ubiquitous in games and other virtual reality (VR) applications and mainly relies on external controllers. As brain–computer interfaces (BCIs) rely on mental control, bypassing traditional neural pathways, they provide paralyzed users with an alternative way to navigate...
Main Authors: | Yang, Liuyin; Van Hulle, Marc M. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2023 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9921617/ https://www.ncbi.nlm.nih.gov/pubmed/36772744 http://dx.doi.org/10.3390/s23031704 |
_version_ | 1784887354694041600 |
---|---|
author | Yang, Liuyin; Van Hulle, Marc M. |
author_facet | Yang, Liuyin; Van Hulle, Marc M. |
author_sort | Yang, Liuyin |
collection | PubMed |
description | Navigation in virtual worlds is ubiquitous in games and other virtual reality (VR) applications and mainly relies on external controllers. As brain–computer interfaces (BCIs) rely on mental control, bypassing traditional neural pathways, they provide paralyzed users with an alternative way to navigate. However, the majority of BCI-based navigation studies adopt cue-based visual paradigms, and the evoked brain responses are encoded into navigation commands. Although robust and accurate, these paradigms are less intuitive and comfortable for navigation than imagining limb movements (motor imagery, MI). However, decoding motor imagery from EEG activity is notoriously challenging. Typically, wet electrodes are used to improve EEG signal quality, a large number of them is used to discriminate between movements of different limbs, and a cue-based paradigm is used instead of a self-paced one to maximize decoding performance. Motor BCI applications primarily focus on typing or on navigating a wheelchair—the latter raises safety concerns—thereby calling for sensors scanning the environment for obstacles and potentially hazardous scenarios. With the help of new technologies such as virtual reality (VR), vivid graphics can be rendered, providing the user with a safe and immersive experience; and they could be used for navigation purposes, a topic that has yet to be fully explored in the BCI community. In this study, we propose a novel MI-BCI application based on an 8-dry-electrode EEG setup, with which users can explore and navigate in Google Street View®. We pay attention to system design to address the lower performance of the MI decoder due to the dry electrodes' lower signal quality and the small number of electrodes. Specifically, we restricted the number of navigation commands by using a novel middle-level control scheme and avoided decoder mistakes by introducing eye blinks as a control signal in different navigation stages.
Both offline and online experiments were conducted with 20 healthy subjects. The results showed acceptable performance, even given the limitations of the EEG setup, which we attribute to the design of the BCI application. The study suggests the use of MI-BCI in future games and VR applications for consumers and patients temporarily or permanently devoid of muscle control. |
format | Online Article Text |
id | pubmed-9921617 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9921617 2023-02-12 Real-Time Navigation in Google Street View® Using a Motor Imagery-Based BCI Yang, Liuyin; Van Hulle, Marc M. Sensors (Basel) Article MDPI 2023-02-03 /pmc/articles/PMC9921617/ /pubmed/36772744 http://dx.doi.org/10.3390/s23031704 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Yang, Liuyin; Van Hulle, Marc M. Real-Time Navigation in Google Street View® Using a Motor Imagery-Based BCI |
title | Real-Time Navigation in Google Street View® Using a Motor Imagery-Based BCI |
title_full | Real-Time Navigation in Google Street View® Using a Motor Imagery-Based BCI |
title_fullStr | Real-Time Navigation in Google Street View® Using a Motor Imagery-Based BCI |
title_full_unstemmed | Real-Time Navigation in Google Street View® Using a Motor Imagery-Based BCI |
title_short | Real-Time Navigation in Google Street View® Using a Motor Imagery-Based BCI |
title_sort | real-time navigation in google street view® using a motor imagery-based bci
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9921617/ https://www.ncbi.nlm.nih.gov/pubmed/36772744 http://dx.doi.org/10.3390/s23031704 |
work_keys_str_mv | AT yangliuyin realtimenavigationingooglestreetviewusingamotorimagerybasedbci AT vanhullemarcm realtimenavigationingooglestreetviewusingamotorimagerybasedbci |