
Auditory grouping mechanisms reflect a sound's relative position in a sequence

The human brain uses acoustic cues to decompose complex auditory scenes into their components. For instance, to improve communication, a listener can select an individual “stream,” such as a talker in a crowded room, based on cues such as pitch or location. Despite numerous investigations into auditory...

Full Description

Bibliographic Details
Main Authors: Hill, Kevin T., Bishop, Christopher W., Miller, Lee M.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2012
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3370426/
https://www.ncbi.nlm.nih.gov/pubmed/22701410
http://dx.doi.org/10.3389/fnhum.2012.00158
_version_ 1782235142664749056
author Hill, Kevin T.
Bishop, Christopher W.
Miller, Lee M.
author_facet Hill, Kevin T.
Bishop, Christopher W.
Miller, Lee M.
author_sort Hill, Kevin T.
collection PubMed
description The human brain uses acoustic cues to decompose complex auditory scenes into their components. For instance, to improve communication, a listener can select an individual “stream,” such as a talker in a crowded room, based on cues such as pitch or location. Despite numerous investigations into auditory streaming, few have demonstrated clear correlates of perception; instead, in many studies perception covaries with changes in physical stimulus properties (e.g., frequency separation). In the current report, we employ a classic ABA streaming paradigm and human electroencephalography (EEG) to disentangle the individual contributions of stimulus properties from changes in auditory perception. We find that changes in perceptual state—that is, the perception of one versus two auditory streams with physically identical stimuli—and changes in physical stimulus properties are reflected independently in the event-related potential (ERP) during overlapping time windows. These findings emphasize the necessity of controlling for stimulus properties when studying perceptual effects of streaming. Furthermore, the independence of the perceptual effect from stimulus properties suggests the neural correlates of streaming reflect a tone's relative position within a larger sequence (1st, 2nd, 3rd) rather than its acoustics. By clarifying the role of stimulus attributes along with perceptual changes, this study helps explain precisely how the brain is able to distinguish a sound source of interest in an auditory scene.
format Online
Article
Text
id pubmed-3370426
institution National Center for Biotechnology Information
language English
publishDate 2012
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-33704262012-06-13 Auditory grouping mechanisms reflect a sound's relative position in a sequence Hill, Kevin T. Bishop, Christopher W. Miller, Lee M. Front Hum Neurosci Neuroscience The human brain uses acoustic cues to decompose complex auditory scenes into their components. For instance, to improve communication, a listener can select an individual “stream,” such as a talker in a crowded room, based on cues such as pitch or location. Despite numerous investigations into auditory streaming, few have demonstrated clear correlates of perception; instead, in many studies perception covaries with changes in physical stimulus properties (e.g., frequency separation). In the current report, we employ a classic ABA streaming paradigm and human electroencephalography (EEG) to disentangle the individual contributions of stimulus properties from changes in auditory perception. We find that changes in perceptual state—that is, the perception of one versus two auditory streams with physically identical stimuli—and changes in physical stimulus properties are reflected independently in the event-related potential (ERP) during overlapping time windows. These findings emphasize the necessity of controlling for stimulus properties when studying perceptual effects of streaming. Furthermore, the independence of the perceptual effect from stimulus properties suggests the neural correlates of streaming reflect a tone's relative position within a larger sequence (1st, 2nd, 3rd) rather than its acoustics. By clarifying the role of stimulus attributes along with perceptual changes, this study helps explain precisely how the brain is able to distinguish a sound source of interest in an auditory scene. Frontiers Media S.A. 2012-06-08 /pmc/articles/PMC3370426/ /pubmed/22701410 http://dx.doi.org/10.3389/fnhum.2012.00158 Text en Copyright © 2012 Hill, Bishop and Miller.
http://www.frontiersin.org/licenseagreement This is an open-access article distributed under the terms of the Creative Commons Attribution Non Commercial License, which permits non-commercial use, distribution, and reproduction in other forums, provided the original authors and source are credited.
spellingShingle Neuroscience
Hill, Kevin T.
Bishop, Christopher W.
Miller, Lee M.
Auditory grouping mechanisms reflect a sound's relative position in a sequence
title Auditory grouping mechanisms reflect a sound's relative position in a sequence
title_full Auditory grouping mechanisms reflect a sound's relative position in a sequence
title_fullStr Auditory grouping mechanisms reflect a sound's relative position in a sequence
title_full_unstemmed Auditory grouping mechanisms reflect a sound's relative position in a sequence
title_short Auditory grouping mechanisms reflect a sound's relative position in a sequence
title_sort auditory grouping mechanisms reflect a sound's relative position in a sequence
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3370426/
https://www.ncbi.nlm.nih.gov/pubmed/22701410
http://dx.doi.org/10.3389/fnhum.2012.00158
work_keys_str_mv AT hillkevint auditorygroupingmechanismsreflectasoundsrelativepositioninasequence
AT bishopchristopherw auditorygroupingmechanismsreflectasoundsrelativepositioninasequence
AT millerleem auditorygroupingmechanismsreflectasoundsrelativepositioninasequence