
Influence of Auditory Cues on the Neuronal Response to Naturalistic Visual Stimuli in a Virtual Reality Setting


Bibliographic Details
Main Authors: Al Boustani, George; Weiß, Lennart Jakob Konstantin; Li, Hongwei; Meyer, Svea Marie; Hiendlmeier, Lukas; Rinklin, Philipp; Menze, Bjoern; Hemmert, Werner; Wolfrum, Bernhard
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects: Human Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9201822/
https://www.ncbi.nlm.nih.gov/pubmed/35721351
http://dx.doi.org/10.3389/fnhum.2022.809293
collection PubMed
description Virtual reality environments offer great opportunities to study the performance of brain-computer interfaces (BCIs) in real-world contexts. As real-world stimuli are typically multimodal, their neuronal integration elicits complex response patterns. To investigate the effect of additional auditory cues on the processing of visual information, we used virtual reality to mimic safety-related events in an industrial environment while we concomitantly recorded electroencephalography (EEG) signals. We simulated a box traveling on a conveyor belt system where two types of stimuli – an exploding and a burning box – interrupt regular operation. The recordings from 16 subjects were divided into two subsets, a visual-only and an audio-visual experiment. In the visual-only experiment, both stimuli elicited a similar response pattern – a visual evoked potential (VEP) followed by an event-related potential (ERP) over the occipital-parietal lobe. Moreover, we found the perceived severity of the event to be reflected in the signal amplitude. Interestingly, the additional auditory cues had a twofold effect on these findings: the P1 component was significantly suppressed in the case of the exploding box stimulus, whereas the N2c component was enhanced for the burning box stimulus. This result highlights the impact of multisensory integration on the performance of realistic BCI applications. Indeed, we observed alterations in the offline classification accuracy for a detection task based on mixed feature extraction (variance, power spectral density, and discrete wavelet transform) and a support vector machine classifier. For the explosion, the accuracy decreased slightly, by 1.64 percentage points, in the audio-visual experiment compared to the visual-only one. In contrast, the classification accuracy for the burning box increased by 5.58 percentage points when additional auditory cues were present. Hence, we conclude that, especially in challenging detection tasks, it is favorable to consider the potential of multisensory integration when BCIs are supposed to operate under (multimodal) real-world conditions.
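
The description above refers to an offline detection task based on mixed feature extraction (variance, power spectral density, and discrete wavelet transform) and a support vector machine classifier. The Python sketch below illustrates what such a per-epoch pipeline could look like; the sampling rate, frequency bands, wavelet family ('db4'), and SVM settings are illustrative assumptions and are not taken from the article.

import numpy as np
import pywt
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 250  # assumed EEG sampling rate in Hz (not specified in this record)

def epoch_features(epoch):
    """epoch: (n_channels, n_samples) EEG segment -> 1-D feature vector."""
    feats = []
    for ch in epoch:
        # 1) variance of the time-domain signal
        feats.append(np.var(ch))
        # 2) band power from Welch's power spectral density estimate
        f, pxx = welch(ch, fs=FS, nperseg=min(len(ch), FS))
        for lo, hi in [(1, 4), (4, 8), (8, 13), (13, 30)]:  # assumed bands (Hz)
            feats.append(pxx[(f >= lo) & (f < hi)].sum())
        # 3) energy per sub-band of a discrete wavelet decomposition
        coeffs = pywt.wavedec(ch, 'db4', level=4)
        feats.extend(float(np.sum(c ** 2)) for c in coeffs)
    return np.asarray(feats)

def detection_accuracy(epochs, labels):
    """epochs: (n_trials, n_channels, n_samples); labels: 1 = event, 0 = baseline."""
    X = np.stack([epoch_features(e) for e in epochs])
    clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0))
    return cross_val_score(clf, X, labels, cv=5).mean()
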
format Online
Article
Text
id pubmed-9201822
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-9201822 2022-06-17 Front Hum Neurosci (Human Neuroscience). Frontiers Media S.A., published online 2022-06-02. Copyright © 2022 Al Boustani, Weiß, Li, Meyer, Hiendlmeier, Rinklin, Menze, Hemmert and Wolfrum. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
topic Human Neuroscience