Auditory motion perception emerges from successive sound localizations integrated over time
Main Authors: Roggerone, Vincent; Vacher, Jonathan; Tarlao, Cynthia; Guastavino, Catherine
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2019
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6848124/ https://www.ncbi.nlm.nih.gov/pubmed/31712688 http://dx.doi.org/10.1038/s41598-019-52742-0
_version_ | 1783469027207675904 |
author | Roggerone, Vincent; Vacher, Jonathan; Tarlao, Cynthia; Guastavino, Catherine |
author_facet | Roggerone, Vincent; Vacher, Jonathan; Tarlao, Cynthia; Guastavino, Catherine |
author_sort | Roggerone, Vincent |
collection | PubMed |
description | Humans rely on auditory information to estimate the path of moving sound sources. But unlike in vision, the existence of motion-sensitive mechanisms in audition is still open to debate. Psychophysical studies indicate that auditory motion perception emerges from successive localization, but existing models fail to predict experimental results. However, these models do not account for any temporal integration. We propose a new model tracking motion using successive localization snapshots but integrated over time. This model is derived from psychophysical experiments on the upper limit for circular auditory motion perception (UL), defined as the speed above which humans no longer identify the direction of sounds spinning around them. Our model predicts ULs measured with different stimuli using solely static localization cues. The temporal integration blurs these localization cues rendering them unreliable at high speeds, which results in the UL. Our findings indicate that auditory motion perception does not require motion-sensitive mechanisms. |
format | Online Article Text |
id | pubmed-6848124 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-6848124 2019-11-19
Auditory motion perception emerges from successive sound localizations integrated over time
Roggerone, Vincent; Vacher, Jonathan; Tarlao, Cynthia; Guastavino, Catherine
Sci Rep (Article)
Humans rely on auditory information to estimate the path of moving sound sources. But unlike in vision, the existence of motion-sensitive mechanisms in audition is still open to debate. Psychophysical studies indicate that auditory motion perception emerges from successive localization, but existing models fail to predict experimental results. However, these models do not account for any temporal integration. We propose a new model tracking motion using successive localization snapshots but integrated over time. This model is derived from psychophysical experiments on the upper limit for circular auditory motion perception (UL), defined as the speed above which humans no longer identify the direction of sounds spinning around them. Our model predicts ULs measured with different stimuli using solely static localization cues. The temporal integration blurs these localization cues rendering them unreliable at high speeds, which results in the UL. Our findings indicate that auditory motion perception does not require motion-sensitive mechanisms.
Nature Publishing Group UK, 2019-11-11
/pmc/articles/PMC6848124/ /pubmed/31712688 http://dx.doi.org/10.1038/s41598-019-52742-0
Text, en
© The Author(s) 2019. Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. |
spellingShingle | Article; Roggerone, Vincent; Vacher, Jonathan; Tarlao, Cynthia; Guastavino, Catherine; Auditory motion perception emerges from successive sound localizations integrated over time |
title | Auditory motion perception emerges from successive sound localizations integrated over time |
title_full | Auditory motion perception emerges from successive sound localizations integrated over time |
title_fullStr | Auditory motion perception emerges from successive sound localizations integrated over time |
title_full_unstemmed | Auditory motion perception emerges from successive sound localizations integrated over time |
title_short | Auditory motion perception emerges from successive sound localizations integrated over time |
title_sort | auditory motion perception emerges from successive sound localizations integrated over time |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6848124/ https://www.ncbi.nlm.nih.gov/pubmed/31712688 http://dx.doi.org/10.1038/s41598-019-52742-0 |
work_keys_str_mv | AT roggeronevincent auditorymotionperceptionemergesfromsuccessivesoundlocalizationsintegratedovertime AT vacherjonathan auditorymotionperceptionemergesfromsuccessivesoundlocalizationsintegratedovertime AT tarlaocynthia auditorymotionperceptionemergesfromsuccessivesoundlocalizationsintegratedovertime AT guastavinocatherine auditorymotionperceptionemergesfromsuccessivesoundlocalizationsintegratedovertime |
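The blurring mechanism described in the abstract can be illustrated with a toy sketch. This is not the authors' actual model, and the 0.2 s integration window and the speeds below are hypothetical: the idea is only that averaging localization cues (here, unit vectors pointing at the source azimuth) over a fixed time window shrinks the resultant cue strength as rotation speed grows, until the cue carries no directional information.

```python
import math

def integrated_cue_strength(rev_per_s, window_s, n_samples=1000):
    """Resultant length of localization cues averaged over the
    integration window, for a source circling at `rev_per_s`
    revolutions per second. 1.0 = sharp cue, 0.0 = fully blurred."""
    xs = ys = 0.0
    for i in range(n_samples):
        t = window_s * i / n_samples
        theta = 2 * math.pi * rev_per_s * t  # source azimuth at time t
        xs += math.cos(theta)
        ys += math.sin(theta)
    return math.hypot(xs, ys) / n_samples

window = 0.2  # hypothetical integration window (s)
for speed in (0.5, 1.0, 2.0, 3.0, 5.0):  # revolutions per second
    strength = integrated_cue_strength(speed, window)
    print(f"{speed} rev/s -> cue strength {strength:.2f}")
```

At 5 rev/s with a 0.2 s window the source sweeps a full revolution within the window, so the averaged cue vanishes entirely; intermediate speeds yield progressively weaker cues, which is the kind of speed-dependent degradation the abstract invokes to explain the upper limit.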