Spatial Processing Is Frequency Specific in Auditory Cortex But Not in the Midbrain
Main Authors: | Sollini, Joseph; Mill, Robert; Sumner, Christian J. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Society for Neuroscience, 2017 |
Subjects: | Research Articles |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5511886/ https://www.ncbi.nlm.nih.gov/pubmed/28559383 http://dx.doi.org/10.1523/JNEUROSCI.3034-16.2017 |
_version_ | 1783250409329000448 |
---|---|
author | Sollini, Joseph; Mill, Robert; Sumner, Christian J. |
author_facet | Sollini, Joseph; Mill, Robert; Sumner, Christian J. |
author_sort | Sollini, Joseph |
collection | PubMed |
description | The cochlea behaves like a bank of band-pass filters, segregating information into different frequency channels. Some aspects of perception reflect processing within individual channels, but others involve the integration of information across them. One instance of this is sound localization, which improves with increasing bandwidth. The processing of binaural cues for sound location has been studied extensively. However, although the advantage conferred by bandwidth is clear, we currently know little about how this additional information is combined to form our percept of space. We investigated the ability of cells in the auditory system of guinea pigs to compare interaural level differences (ILDs), a key localization cue, between tones of disparate frequencies in each ear. Cells in auditory cortex believed to be integral to ILD processing (excitatory from one ear, inhibitory from the other: EI cells) compare ILDs separately over restricted frequency ranges which are not consistent with their monaural tuning. In contrast, cells that are excitatory from both ears (EE cells) show no evidence of frequency-specific processing. Both cell types are explained by a model in which ILDs are computed within separate frequency channels and subsequently combined in a single cortical cell. Interestingly, ILD processing in all inferior colliculus cell types (EE and EI) is largely consistent with processing within single, matched-frequency channels from each ear. Our data suggest a clear constraint on the way that localization cues are integrated: cortical ILD tuning to broadband sounds is a composite of separate, frequency-specific, binaurally sensitive channels. This frequency-specific processing appears after the level of the midbrain. SIGNIFICANCE STATEMENT For some sensory modalities (e.g., somatosensation, vision), the spatial arrangement of the outside world is inherited by the brain from the periphery. The auditory periphery is arranged spatially by frequency, not spatial location. Therefore, our auditory perception of location must be synthesized from physical cues in separate frequency channels. There are multiple cues (e.g., timing, level, spectral cues), but even single cues (e.g., level differences) are frequency dependent. The synthesis of location must account for this frequency dependence, but it is not known how this might occur. Here, we investigated how interaural-level differences are combined across frequency along the ascending auditory system. We found that the integration in auditory cortex preserves the independence of the different-level cues in different frequency regions. |
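The description above contrasts two ways of combining interaural level differences (ILDs) across frequency: a frequency-specific scheme in which a binaural comparison is made within each channel and the channel outputs are then combined (as suggested by the cortical EI-cell data), and a comparison within a single, matched-frequency channel from each ear (as found in the midbrain). A minimal illustrative sketch, not taken from the paper, with invented levels, an arbitrary sigmoid rate function, and assumed equal channel weights:

```python
import numpy as np

# Toy illustration (not from the paper): all frequencies, levels, weights,
# and the sigmoid rate function below are invented for the example.

freqs_khz = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])          # cochlear channels
level_contra = np.array([60.0, 62.0, 65.0, 70.0, 72.0, 68.0])  # dB SPL, contralateral ear
level_ipsi = np.array([58.0, 57.0, 55.0, 50.0, 45.0, 40.0])    # dB SPL, ipsilateral ear

# ILD computed independently within each matched-frequency channel (dB).
ild_per_channel = level_contra - level_ipsi

def ei_rate(ild_db, midpoint=10.0, slope=0.5):
    """Toy EI-like rate-vs-ILD function: the response grows as the
    contralateral ear becomes louder. Parameters are arbitrary."""
    return 1.0 / (1.0 + np.exp(-slope * (ild_db - midpoint)))

# Cortex-like, frequency-specific scheme: an ILD-sensitive response is formed
# within each frequency channel first, then the channel outputs are combined
# in a single cortical cell (equal weighting assumed here).
weights = np.full(len(freqs_khz), 1.0 / len(freqs_khz))
cortical_response = float(np.dot(weights, ei_rate(ild_per_channel)))

# Midbrain-like scheme: the ILD comparison uses a single, matched-frequency
# channel from each ear (here an assumed characteristic frequency of 4 kHz).
cf_index = int(np.argmin(np.abs(freqs_khz - 4.0)))
midbrain_response = float(ei_rate(ild_per_channel[cf_index]))

print("Per-channel ILDs (dB):", ild_per_channel)
print(f"Frequency-specific (cortex-like) response: {cortical_response:.3f}")
print(f"Single matched-channel (midbrain-like) response: {midbrain_response:.3f}")
```

The contrast is only schematic, but it captures the constraint described in the abstract: in the frequency-specific scheme the binaural nonlinearity is applied before pooling across frequency, so the level cues carried by different frequency regions remain independent when they are combined.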
format | Online Article Text |
id | pubmed-5511886 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2017 |
publisher | Society for Neuroscience |
record_format | MEDLINE/PubMed |
spelling | pubmed-55118862017-08-08 Spatial Processing Is Frequency Specific in Auditory Cortex But Not in the Midbrain Sollini, Joseph Mill, Robert Sumner, Christian J. J Neurosci Research Articles The cochlea behaves like a bank of band-pass filters, segregating information into different frequency channels. Some aspects of perception reflect processing within individual channels, but others involve the integration of information across them. One instance of this is sound localization, which improves with increasing bandwidth. The processing of binaural cues for sound location has been studied extensively. However, although the advantage conferred by bandwidth is clear, we currently know little about how this additional information is combined to form our percept of space. We investigated the ability of cells in the auditory system of guinea pigs to compare interaural level differences (ILDs), a key localization cue, between tones of disparate frequencies in each ear. Cells in auditory cortex believed to be integral to ILD processing (excitatory from one ear, inhibitory from the other: EI cells) compare ILDs separately over restricted frequency ranges which are not consistent with their monaural tuning. In contrast, cells that are excitatory from both ears (EE cells) show no evidence of frequency-specific processing. Both cell types are explained by a model in which ILDs are computed within separate frequency channels and subsequently combined in a single cortical cell. Interestingly, ILD processing in all inferior colliculus cell types (EE and EI) is largely consistent with processing within single, matched-frequency channels from each ear. Our data suggest a clear constraint on the way that localization cues are integrated: cortical ILD tuning to broadband sounds is a composite of separate, frequency-specific, binaurally sensitive channels. This frequency-specific processing appears after the level of the midbrain. SIGNIFICANCE STATEMENT For some sensory modalities (e.g., somatosensation, vision), the spatial arrangement of the outside world is inherited by the brain from the periphery. The auditory periphery is arranged spatially by frequency, not spatial location. Therefore, our auditory perception of location must be synthesized from physical cues in separate frequency channels. There are multiple cues (e.g., timing, level, spectral cues), but even single cues (e.g., level differences) are frequency dependent. The synthesis of location must account for this frequency dependence, but it is not known how this might occur. Here, we investigated how interaural-level differences are combined across frequency along the ascending auditory system. We found that the integration in auditory cortex preserves the independence of the different-level cues in different frequency regions. Society for Neuroscience 2017-07-05 /pmc/articles/PMC5511886/ /pubmed/28559383 http://dx.doi.org/10.1523/JNEUROSCI.3034-16.2017 Text en Copyright © 2017 Sollini et al. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License Creative Commons Attribution 4.0 International (https://creativecommons.org/licenses/by/4.0/) , which permits unrestricted use, distribution and reproduction in any medium provided that the original work is properly attributed. |
spellingShingle | Research Articles Sollini, Joseph Mill, Robert Sumner, Christian J. Spatial Processing Is Frequency Specific in Auditory Cortex But Not in the Midbrain |
title | Spatial Processing Is Frequency Specific in Auditory Cortex But Not in the Midbrain |
title_full | Spatial Processing Is Frequency Specific in Auditory Cortex But Not in the Midbrain |
title_fullStr | Spatial Processing Is Frequency Specific in Auditory Cortex But Not in the Midbrain |
title_full_unstemmed | Spatial Processing Is Frequency Specific in Auditory Cortex But Not in the Midbrain |
title_short | Spatial Processing Is Frequency Specific in Auditory Cortex But Not in the Midbrain |
title_sort | spatial processing is frequency specific in auditory cortex but not in the midbrain |
topic | Research Articles |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5511886/ https://www.ncbi.nlm.nih.gov/pubmed/28559383 http://dx.doi.org/10.1523/JNEUROSCI.3034-16.2017 |
work_keys_str_mv | AT sollinijoseph spatialprocessingisfrequencyspecificinauditorycortexbutnotinthemidbrain AT millrobert spatialprocessingisfrequencyspecificinauditorycortexbutnotinthemidbrain AT sumnerchristianj spatialprocessingisfrequencyspecificinauditorycortexbutnotinthemidbrain |