Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm

Although sound position is initially head-centred (egocentric coordinates), our brain can also represent sounds relative to one another (allocentric coordinates). Whether reference frames for spatial hearing are independent or interact remained largely unexplored. Here we developed a new allocentric...

Bibliographic Details
Main Authors: Rabini, Giuseppe; Altobelli, Elena; Pavani, Francesco
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6536515/
https://www.ncbi.nlm.nih.gov/pubmed/31133688
http://dx.doi.org/10.1038/s41598-019-44267-3
_version_ 1783421760442466304
author Rabini, Giuseppe
Altobelli, Elena
Pavani, Francesco
author_sort Rabini, Giuseppe
collection PubMed
description Although sound position is initially head-centred (egocentric coordinates), our brain can also represent sounds relative to one another (allocentric coordinates). Whether reference frames for spatial hearing are independent or interact remained largely unexplored. Here we developed a new allocentric spatial-hearing training and tested whether it can improve egocentric sound-localisation performance in normal-hearing adults listening with one ear plugged. Two groups of participants (N = 15 each) performed an egocentric sound-localisation task (pointing to a syllable) in monaural listening, before and after 4 days of multisensory training on triplets of white-noise bursts paired with occasional visual feedback. Critically, one group performed an allocentric task (auditory bisection task), whereas the other processed the same stimuli to perform an egocentric task (pointing to a designated sound of the triplet). Unlike most previous work, we also tested a no-training group (N = 15). Egocentric sound-localisation abilities in the horizontal plane improved for all groups in the space ipsilateral to the ear plug. This unexpected finding highlights the importance of including a no-training group when studying sound-localisation re-learning. Yet, performance changes were qualitatively different in trained compared to untrained participants, providing initial evidence that allocentric and multisensory procedures may prove useful when aiming to promote sound-localisation re-learning.
format Online
Article
Text
id pubmed-6536515
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-6536515 2019-06-06 Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm Rabini, Giuseppe Altobelli, Elena Pavani, Francesco Sci Rep Article Although sound position is initially head-centred (egocentric coordinates), our brain can also represent sounds relative to one another (allocentric coordinates). Whether reference frames for spatial hearing are independent or interact remained largely unexplored. Here we developed a new allocentric spatial-hearing training and tested whether it can improve egocentric sound-localisation performance in normal-hearing adults listening with one ear plugged. Two groups of participants (N = 15 each) performed an egocentric sound-localisation task (pointing to a syllable) in monaural listening, before and after 4 days of multisensory training on triplets of white-noise bursts paired with occasional visual feedback. Critically, one group performed an allocentric task (auditory bisection task), whereas the other processed the same stimuli to perform an egocentric task (pointing to a designated sound of the triplet). Unlike most previous work, we also tested a no-training group (N = 15). Egocentric sound-localisation abilities in the horizontal plane improved for all groups in the space ipsilateral to the ear plug. This unexpected finding highlights the importance of including a no-training group when studying sound-localisation re-learning. Yet, performance changes were qualitatively different in trained compared to untrained participants, providing initial evidence that allocentric and multisensory procedures may prove useful when aiming to promote sound-localisation re-learning. Nature Publishing Group UK 2019-05-27 /pmc/articles/PMC6536515/ /pubmed/31133688 http://dx.doi.org/10.1038/s41598-019-44267-3 Text en © The Author(s) 2019 Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
title Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6536515/
https://www.ncbi.nlm.nih.gov/pubmed/31133688
http://dx.doi.org/10.1038/s41598-019-44267-3