
Contextual modulation of primary visual cortex by auditory signals

Bibliographic Details
Main Authors: Petro, L. S., Paton, A. T., Muckli, L.
Format: Online Article Text
Language: English
Published: The Royal Society, 2017
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5206272/
https://www.ncbi.nlm.nih.gov/pubmed/28044015
http://dx.doi.org/10.1098/rstb.2016.0104
Description: Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195–201. (doi:10.1016/j.conb.2013.01.020)), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256–1262. (doi:10.1016/j.cub.2014.04.020)). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints of auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We also present data showing we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1. We speculate which cellular mechanisms allow V1 to be contextually modulated by auditory input to facilitate perception, cognition and behaviour. Beyond cortical feedback that facilitates perception, we argue that there is also feedback serving counterfactual processing during imagery, dreaming and mind wandering, which is not relevant for immediate perception but for behaviour and cognition over a longer time frame. This article is part of the themed issue ‘Auditory and visual scene analysis’.

Journal: Philos Trans R Soc Lond B Biol Sci
Published online: 2017-02-19
License: © 2017 The Authors. Published by the Royal Society under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, provided the original author and source are credited.