A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli
To date, traditional visual-based event-related potential brain-computer interface (ERP-BCI) systems continue to dominate mainstream BCI research. However, these conventional BCIs are unsuitable for individuals who have partially or completely lost their vision. Given the poor performance...
Main Authors: Zhang, Boyang; Zhou, Zongtan; Jiang, Jing
Format: Online Article Text
Language: English
Published: MDPI, 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7464701/ https://www.ncbi.nlm.nih.gov/pubmed/32781712 http://dx.doi.org/10.3390/brainsci10080524
_version_ | 1783577423898476544 |
author | Zhang, Boyang; Zhou, Zongtan; Jiang, Jing
author_facet | Zhang, Boyang; Zhou, Zongtan; Jiang, Jing
author_sort | Zhang, Boyang |
collection | PubMed |
description | To date, traditional visual-based event-related potential brain-computer interface (ERP-BCI) systems continue to dominate mainstream BCI research. However, these conventional BCIs are unsuitable for individuals who have partially or completely lost their vision. Given the poor performance of gaze-independent ERP-BCIs, techniques to improve the performance of these systems are needed. In this paper, we developed a novel 36-class bimodal ERP-BCI system based on tactile and auditory stimuli, in which six-virtual-direction audio files produced via head-related transfer functions (HRTFs) were delivered through headphones while location-congruent electro-tactile stimuli were simultaneously delivered to the corresponding positions using electrodes placed on the abdomen and waist. We selected the eight best channels, trained a Bayesian linear discriminant analysis (BLDA) classifier, and determined the optimal number of trials for target selection in the online process. The average online information transfer rate (ITR) of the bimodal ERP-BCI reached 11.66 bit/min, improvements of 35.11% and 36.69% over the auditory (8.63 bit/min) and tactile (8.53 bit/min) approaches, respectively. These results demonstrate that the performance of the bimodal system is superior to that of each unimodal system and indicate that the proposed bimodal system has potential utility as a gaze-independent BCI in real-world applications.
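The abstract outlines the decoding pipeline: features from the eight selected channels feed a BLDA classifier, and the online system selects the target with the strongest averaged classifier response. Below is a minimal, hypothetical sketch of that target-selection step in Python. It uses scikit-learn's BayesianRidge as a stand-in for BLDA (both fit a linear model to ±1 class targets with regularization learned via evidence maximization); the shapes, the synthetic data, and the scoring rule are illustrative assumptions, not the paper's specification.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)

# Hypothetical shapes: 8 selected channels x 50 time samples per epoch,
# flattened into feature vectors (the paper's exact windowing is not given here).
n_train, n_features, n_classes = 720, 8 * 50, 36

# Synthetic stand-in data: real use would supply preprocessed ERP epochs.
X_train = rng.standard_normal((n_train, n_features))
y_train = rng.choice([-1.0, 1.0], size=n_train)  # +1 = target epoch, -1 = non-target

# BayesianRidge learns its regularization hyperparameters by evidence
# maximization, the same mechanism BLDA uses to fit its regression weights.
clf = BayesianRidge().fit(X_train, y_train)

# Online selection: average the classifier output over the epochs collected
# for each of the 36 candidate targets, then pick the highest-scoring one.
epochs_per_class = {c: rng.standard_normal((5, n_features)) for c in range(n_classes)}
scores = {c: clf.predict(e).mean() for c, e in epochs_per_class.items()}
selected = max(scores, key=scores.get)
print(f"selected class: {selected}")
```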
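The reported ITR figures are internally consistent: the relative improvements quoted in the abstract follow directly from the three bit rates. The snippet below checks that arithmetic and, for reference, includes the standard Wolpaw ITR formula commonly used in ERP-BCI studies (the abstract does not state which ITR definition the authors applied, so treat that as an assumption).

```python
import math

def wolpaw_bits_per_selection(n_classes: int, accuracy: float) -> float:
    """Bits per selection under the standard Wolpaw ITR model.

    Valid for 1/n_classes < accuracy < 1; multiply by the number of
    selections per minute to obtain an ITR in bit/min.
    """
    return (math.log2(n_classes)
            + accuracy * math.log2(accuracy)
            + (1 - accuracy) * math.log2((1 - accuracy) / (n_classes - 1)))

# Online ITRs (bit/min) reported in the abstract.
itr_bimodal, itr_auditory, itr_tactile = 11.66, 8.63, 8.53

# Relative improvements of the bimodal system over each unimodal system.
print(f"vs auditory: {100 * (itr_bimodal / itr_auditory - 1):.2f}%")  # 35.11%
print(f"vs tactile:  {100 * (itr_bimodal / itr_tactile - 1):.2f}%")   # 36.69%
```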
format | Online Article Text |
id | pubmed-7464701 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-7464701 2020-09-04 A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli Zhang, Boyang; Zhou, Zongtan; Jiang, Jing Brain Sci Article MDPI 2020-08-06 /pmc/articles/PMC7464701/ /pubmed/32781712 http://dx.doi.org/10.3390/brainsci10080524 Text en © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle | Article Zhang, Boyang Zhou, Zongtan Jiang, Jing A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli |
title | A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli |
title_full | A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli |
title_fullStr | A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli |
title_full_unstemmed | A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli |
title_short | A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli |
title_sort | 36-class bimodal erp brain-computer interface using location-congruent auditory-tactile stimuli |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7464701/ https://www.ncbi.nlm.nih.gov/pubmed/32781712 http://dx.doi.org/10.3390/brainsci10080524 |
work_keys_str_mv | AT zhangboyang a36classbimodalerpbraincomputerinterfaceusinglocationcongruentauditorytactilestimuli AT zhouzongtan a36classbimodalerpbraincomputerinterfaceusinglocationcongruentauditorytactilestimuli AT jiangjing a36classbimodalerpbraincomputerinterfaceusinglocationcongruentauditorytactilestimuli AT zhangboyang 36classbimodalerpbraincomputerinterfaceusinglocationcongruentauditorytactilestimuli AT zhouzongtan 36classbimodalerpbraincomputerinterfaceusinglocationcongruentauditorytactilestimuli AT jiangjing 36classbimodalerpbraincomputerinterfaceusinglocationcongruentauditorytactilestimuli |