See, Hear, or Feel – to Speak: A Versatile Multiple-Choice Functional Near-Infrared Spectroscopy-Brain-Computer Interface Feasible With Visual, Auditory, or Tactile Instructions

Bibliographic Details
Main Authors: Nagels-Coune, Laurien, Riecke, Lars, Benitez-Andonegui, Amaia, Klinkhammer, Simona, Goebel, Rainer, De Weerd, Peter, Lührs, Michael, Sorger, Bettina
Format: Online, Article, Text
Language: English
Published: Frontiers Media S.A., 2021
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8656940/
https://www.ncbi.nlm.nih.gov/pubmed/34899223
http://dx.doi.org/10.3389/fnhum.2021.784522
author Nagels-Coune, Laurien
Riecke, Lars
Benitez-Andonegui, Amaia
Klinkhammer, Simona
Goebel, Rainer
De Weerd, Peter
Lührs, Michael
Sorger, Bettina
collection PubMed
description Severely motor-disabled patients, such as those suffering from the so-called “locked-in” syndrome, cannot communicate naturally. They may benefit from brain-computer interfaces (BCIs), which exploit brain signals for communication and thereby circumvent the muscular system. One BCI technique that has gained attention recently is functional near-infrared spectroscopy (fNIRS). Typically, fNIRS-based BCIs enable brain-based communication via voluntary modulation of brain activity through mental task performance, guided by visual or auditory instructions. While the development of fNIRS-BCIs has made great progress, their reliability across time and environments has rarely been assessed. In the present fNIRS-BCI study, we tested six healthy participants across three consecutive days using a straightforward four-choice fNIRS-BCI communication paradigm that allows answer encoding based on instructions in various sensory modalities. To encode an answer, participants performed a motor imagery task (mental drawing) in one of four time periods. Answer encoding was guided by either the visual, auditory, or tactile sensory modality. Two participants were tested outside the laboratory, in a cafeteria. Answers were decoded from the time course of the most informative fNIRS channel-by-chromophore combination. Across the three testing days, we obtained mean single-trial and multi-trial (joint analysis of four consecutive trials) accuracies of 62.50% and 85.19%, respectively. Multi-trial accuracies were 86.11% for visual, 80.56% for auditory, and 88.89% for tactile encoding. The two participants who used the fNIRS-BCI in a cafeteria obtained the best single-trial (72.22% and 77.78%) and multi-trial accuracies (100% and 94.44%). Communication was reliable over the three recording sessions, with multi-trial accuracies of 86.11% on day 1, 86.11% on day 2, and 83.33% on day 3. To gauge the trade-off between the number of optodes and decoding accuracy, averaging across two or three promising fNIRS channels was compared with the one-channel approach; multi-trial accuracy increased from 85.19% (one channel) to 91.67% (two/three channels). In sum, the presented fNIRS-BCI yielded robust decoding results across three alternative sensory encoding modalities. Further, fNIRS-BCI communication was stable over three consecutive days, even in a natural (social) environment. The developed fNIRS-BCI thus demonstrated high flexibility, reliability, and robustness, which are crucial requirements for future clinical applicability.
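The decoding principle summarized in the abstract (the answer is the encoding window with the strongest hemodynamic response, and the multi-trial analysis averages several consecutive trials first) can be illustrated with a minimal sketch. The snippet below is an illustration under assumed inputs, not the authors' pipeline: the array layout, the `decode_answer` helper, and the simulated data are hypothetical, and the study's actual preprocessing, statistics, and channel selection are not reproduced.

```python
import numpy as np

def decode_answer(trials):
    """Toy four-choice decoder for one fNIRS channel/chromophore.

    trials : ndarray, shape (n_trials, 4, n_samples)
        Time course of the selected channel, cut into the four candidate
        encoding windows of each trial (hypothetical layout; the paper's
        actual preprocessing is not reproduced here).
    """
    # Multi-trial analysis: average corresponding windows across trials.
    windows = trials.mean(axis=0)               # shape (4, n_samples)
    # Score each window by its mean amplitude; mental drawing should
    # raise the hemodynamic response in the window used for encoding.
    scores = windows.mean(axis=1)
    # The highest-scoring window indexes the encoded answer (0-3).
    return int(np.argmax(scores))

# Simulated check: a participant "encodes" answer 2 in four trials.
rng = np.random.default_rng(seed=0)
sim = rng.normal(0.0, 1.0, size=(4, 4, 100))    # 4 trials, 4 windows, 100 samples
sim[:, 2, :] += 1.5                             # extra activation in window 2
print(decode_answer(sim))                       # prints 2
```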
format Online
Article
Text
id pubmed-8656940
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
journal Front Hum Neurosci (published online 2021-11-25)
rights Copyright © 2021 Nagels-Coune, Riecke, Benitez-Andonegui, Klinkhammer, Goebel, De Weerd, Lührs and Sorger. Open-access article distributed under the terms of the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/. The use, distribution, or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution, or reproduction is permitted which does not comply with these terms.
title See, Hear, or Feel – to Speak: A Versatile Multiple-Choice Functional Near-Infrared Spectroscopy-Brain-Computer Interface Feasible With Visual, Auditory, or Tactile Instructions
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8656940/
https://www.ncbi.nlm.nih.gov/pubmed/34899223
http://dx.doi.org/10.3389/fnhum.2021.784522