Modulation of early auditory processing by visual information: Prediction or bimodal integration?

What happens if a visual cue misleads auditory expectations? Previous studies revealed an early visuo–auditory incongruency effect, the so-called incongruency response (IR) of the auditory event-related brain potential (ERP), occurring 100 ms after the onset of a sound that is incongruent with the preceding visual cue. So far, this effect has been taken to reflect the mismatch between the auditory sensory expectation activated by visual predictive information and the actual sensory input. Thus, an IR should be confined to an asynchronous presentation of visual cue and sound. Alternatively, one could argue that frequently presented congruent visual-cue–sound combinations are integrated into a bimodal representation, whereby a violation of the visual–auditory relationship results in a bimodal feature mismatch (in which case the IR should be obtained with both asynchronous and synchronous presentation). In an asynchronous condition, either a high-pitched or a low-pitched sound was preceded by a visual note symbol presented above or below a fixation cross (90% congruent; 10% incongruent), while in a synchronous condition, both were presented simultaneously. High-pitched and low-pitched sounds were presented with different probabilities (83% vs. 17%) to form a strong association between the bimodal stimuli. In both conditions, tones whose pitch was incongruent with the location of the note symbol elicited incongruency effects in the N2 and P3 ERPs; however, the IR was elicited only in the asynchronous condition. This finding supports the sensorial prediction error hypothesis, which states that the amplitude of the auditory ERP 100 ms after sound onset is enhanced in response to unexpected compared with expected but otherwise identical sounds.

Supplementary information: The online version contains supplementary material available at 10.3758/s13414-021-02240-1.

Bibliographic Details
Main Authors: Stuckenberg, Maria V., Schröger, Erich, Widmann, Andreas
Format: Online Article Text
Language: English
Published: Springer US 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8084811/
https://www.ncbi.nlm.nih.gov/pubmed/33506354
http://dx.doi.org/10.3758/s13414-021-02240-1
Journal: Atten Percept Psychophys
Published online: 2021-01-27
Collection: PubMed (National Center for Biotechnology Information)
Record format: MEDLINE/PubMed
Rights: © The Author(s) 2021. Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution, and reproduction in any medium or format, as long as appropriate credit is given to the original author(s) and the source, a link to the licence is provided, and any changes are indicated.