
Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding


Bibliographic Details
Main Authors: Atilgan, Huriye; Town, Stephen M.; Wood, Katherine C.; Jones, Gareth P.; Maddox, Ross K.; Lee, Adrian K.C.; Bizley, Jennifer K.
Format: Online Article Text
Language: English
Published: Cell Press, 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5814679/
https://www.ncbi.nlm.nih.gov/pubmed/29395914
http://dx.doi.org/10.1016/j.neuron.2017.12.034
Description
Summary: How and where in the brain audio-visual signals are bound to create multimodal objects remains unknown. One hypothesis is that temporal coherence between dynamic multisensory signals provides a mechanism for binding stimulus features across sensory modalities. Here, we report that when the luminance of a visual stimulus is temporally coherent with the amplitude fluctuations of one sound in a mixture, the representation of that sound is enhanced in auditory cortex. Critically, this enhancement extends to include both binding and non-binding features of the sound. We demonstrate that visual information conveyed from visual cortex via the phase of the local field potential is combined with auditory information within auditory cortex. These data provide evidence that early cross-sensory binding provides a bottom-up mechanism for the formation of cross-sensory objects and that one role for multisensory binding in auditory cortex is to support auditory scene analysis.