Estimating the Intended Sound Direction of the User: Toward an Auditory Brain-Computer Interface Using Out-of-Head Sound Localization

Bibliographic Details
Main Authors: Nambu, Isao; Ebisawa, Masashi; Kogure, Masumi; Yano, Shohei; Hokari, Haruhide; Wada, Yasuhiro
Format: Online, Article, Text
Language: English
Published: Public Library of Science 2013
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3577758/
https://www.ncbi.nlm.nih.gov/pubmed/23437338
http://dx.doi.org/10.1371/journal.pone.0057174
author Nambu, Isao
Ebisawa, Masashi
Kogure, Masumi
Yano, Shohei
Hokari, Haruhide
Wada, Yasuhiro
collection PubMed
description The auditory Brain-Computer Interface (BCI) using electroencephalograms (EEG) is a subject of intensive study. As a cue, auditory BCIs can deal with many of the characteristics of stimuli such as tone, pitch, and voices. Spatial information on auditory stimuli also provides useful information for a BCI. However, in a portable system, virtual auditory stimuli have to be presented spatially through earphones or headphones, instead of loudspeakers. We investigated the possibility of an auditory BCI using the out-of-head sound localization technique, which enables us to present virtual auditory stimuli to users from any direction, through earphones. The feasibility of a BCI using this technique was evaluated in an EEG oddball experiment and offline analysis. A virtual auditory stimulus was presented to the subject from one of six directions. Using a support vector machine, we were able to classify whether the subject attended the direction of a presented stimulus from EEG signals. The mean accuracy across subjects was 70.0% in the single-trial classification. When we used trial-averaged EEG signals as inputs to the classifier, the mean accuracy across seven subjects reached 89.5% (for 10-trial averaging). Further analysis showed that the P300 event-related potential responses from 200 to 500 ms in central and posterior regions of the brain contributed to the classification. In comparison with the results obtained from a loudspeaker experiment, we confirmed that stimulus presentation by out-of-head sound localization achieved similar event-related potential responses and classification performances. These results suggest that out-of-head sound localization enables us to provide a high-performance and loudspeaker-less portable BCI system.
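To make the classification pipeline described in the abstract concrete, the sketch below illustrates the general approach: epoch the EEG around each auditory stimulus, optionally average epochs across trials, and classify attended versus unattended stimuli with an SVM using the 200-500 ms post-stimulus window. This is a minimal, hypothetical sketch, not the authors' implementation: the data are synthetic, and the sampling rate, channel count, trial counts, and feature window are assumptions chosen only for illustration.

```python
# Illustrative sketch only: single-trial vs. trial-averaged SVM classification
# of P300-like EEG epochs. Synthetic data; parameters are assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

fs = 100                                        # assumed sampling rate (Hz)
n_channels = 8                                  # assumed number of EEG channels
n_trials = 200                                  # synthetic trials per class
window = slice(int(0.2 * fs), int(0.5 * fs))    # 200-500 ms post-stimulus

def synth_epoch(attended: bool) -> np.ndarray:
    """Generate one synthetic 1-s epoch; attended trials get a P300-like bump."""
    t = np.arange(fs) / fs
    epoch = rng.normal(0.0, 1.0, size=(n_channels, fs))
    if attended:
        p300 = 2.0 * np.exp(-((t - 0.35) ** 2) / (2 * 0.05 ** 2))  # peak near 350 ms
        epoch += p300                                              # added to all channels
    return epoch

def features(epochs: np.ndarray) -> np.ndarray:
    """Flatten the 200-500 ms window of each epoch into one feature vector."""
    return epochs[:, :, window].reshape(len(epochs), -1)

# Single-trial data: one epoch per example.
X_single = np.stack([synth_epoch(a) for a in [True] * n_trials + [False] * n_trials])
y_single = np.array([1] * n_trials + [0] * n_trials)

# Trial-averaged data: average groups of 10 same-class epochs before
# classification, mimicking the 10-trial averaging reported in the abstract.
def averaged(attended: bool, n_groups: int, k: int = 10) -> np.ndarray:
    return np.stack([np.mean([synth_epoch(attended) for _ in range(k)], axis=0)
                     for _ in range(n_groups)])

X_avg = np.concatenate([averaged(True, 40), averaged(False, 40)])
y_avg = np.array([1] * 40 + [0] * 40)

clf = SVC(kernel="linear")
print("single-trial CV accuracy:",
      cross_val_score(clf, features(X_single), y_single, cv=5).mean())
print("10-trial-averaged CV accuracy:",
      cross_val_score(clf, features(X_avg), y_avg, cv=5).mean())
```

As in the study, averaging raises the signal-to-noise ratio of the event-related response, so the averaged classifier typically scores well above the single-trial one.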
format Online
Article
Text
id pubmed-3577758
institution National Center for Biotechnology Information
language English
publishDate 2013
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-3577758 2013-02-22 Estimating the Intended Sound Direction of the User: Toward an Auditory Brain-Computer Interface Using Out-of-Head Sound Localization. Nambu, Isao; Ebisawa, Masashi; Kogure, Masumi; Yano, Shohei; Hokari, Haruhide; Wada, Yasuhiro. PLoS One, Research Article. Public Library of Science 2013-02-20 /pmc/articles/PMC3577758/ /pubmed/23437338 http://dx.doi.org/10.1371/journal.pone.0057174 Text en © 2013 Nambu et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
title Estimating the Intended Sound Direction of the User: Toward an Auditory Brain-Computer Interface Using Out-of-Head Sound Localization
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3577758/
https://www.ncbi.nlm.nih.gov/pubmed/23437338
http://dx.doi.org/10.1371/journal.pone.0057174