
Decoding spatial attention with EEG and virtual acoustic space


Bibliographic Details
Main authors: Dong, Yue, Raif, Kaan E., Determan, Sarah C., Gai, Yan
Format: Online Article Text
Language: English
Published: John Wiley and Sons Inc. 2017
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5704085/
https://www.ncbi.nlm.nih.gov/pubmed/29180483
http://dx.doi.org/10.14814/phy2.13512
author Dong, Yue
Raif, Kaan E.
Determan, Sarah C.
Gai, Yan
collection PubMed
description Decoding spatial attention from brain signals has wide applications in brain–computer interfaces (BCI). Previous BCI systems mostly relied on visual patterns or auditory stimulation (e.g., loudspeakers) to evoke synchronous brain signals, and such stimulation protocols have difficulty covering a large range of spatial locations. The present study explored the possibility of using virtual acoustic space and a visual‐auditory matching paradigm to overcome this issue. The technique has the flexibility of generating sound stimulation from virtually any spatial location. Brain signals of eight human subjects were obtained with a 32‐channel electroencephalogram (EEG). Two amplitude‐modulated noises or speech sentences carrying distinct spatial information were presented concurrently. Each sound source was tagged with a unique modulation phase so that the phase of the recorded EEG signals indicated the sound being attended to. The phase‐tagged sound was further filtered with head‐related transfer functions to create the sense of virtual space. Subjects were required to attend to the sound source that best matched the location of a visual target. For all subjects, the phase of a single sound could be accurately recovered over the majority of electrodes from EEG responses of 90 s or less. Fewer electrodes provided significant decoding of auditory attention, and that decoding may require longer EEG responses. The reliability and efficiency of decoding with a single electrode varied across subjects. Overall, the virtual acoustic space protocol has the potential to be used in practical BCI systems.
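The phase-tagging scheme the abstract describes, in which each concurrent sound source carries amplitude modulation with a unique phase, can be sketched as follows. This is a minimal illustration only: the sampling rate, duration, modulation frequency, and function name are assumptions, not values taken from the article, and the subsequent head-related transfer function filtering is omitted.

```python
import numpy as np

fs = 44100   # sampling rate in Hz (assumed, not from the article)
dur = 2.0    # stimulus duration in seconds (illustrative)
fm = 40.0    # modulation frequency in Hz (assumed; a common steady-state rate)
t = np.arange(int(fs * dur)) / fs

rng = np.random.default_rng(0)

def phase_tagged_noise(phase_rad):
    """White-noise carrier, amplitude-modulated at fm with a given phase tag."""
    carrier = rng.standard_normal(t.size)
    # Envelope stays in [0, 1]; the phase offset is the tag that
    # distinguishes this source in the steady-state EEG response.
    envelope = 0.5 * (1.0 + np.sin(2.0 * np.pi * fm * t + phase_rad))
    return envelope * carrier

# Two concurrent sources tagged with distinct modulation phases, so the
# phase of the recorded EEG can indicate which source is being attended.
src_a = phase_tagged_noise(0.0)
src_b = phase_tagged_noise(np.pi)
```

In the study's protocol, each tagged waveform would then be convolved with head-related impulse responses to place it at a virtual spatial location before presentation over headphones.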
id pubmed-5704085
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-5704085 2017-11-30 Physiol Rep, Original Research. John Wiley and Sons Inc. 2017-11-28. /pmc/articles/PMC5704085/ /pubmed/29180483 http://dx.doi.org/10.14814/phy2.13512 Text en © 2017 Saint Louis University.
Physiological Reports published by Wiley Periodicals, Inc. on behalf of The Physiological Society and the American Physiological Society. This is an open access article under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
topic Original Research