
Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies

In dynamic multisensory environments, the perceptual system corrects for discrepancies arising between modalities. For instance, in the ventriloquism aftereffect (VAE), spatial disparities introduced between visual and auditory stimuli lead to a perceptual recalibration of auditory space. Previous research has shown that the VAE is underpinned by multiple recalibration mechanisms tuned to different timescales; however, it remains unclear whether these mechanisms use common or distinct spatial reference frames. Here we asked whether the VAE operates in eye- or head-centred reference frames across a range of adaptation timescales, from a few seconds to a few minutes. We developed a novel paradigm for selectively manipulating the contribution of eye- versus head-centred visual signals to the VAE by manipulating auditory locations relative to either the head orientation or the point of fixation. Consistent with previous research, we found both eye- and head-centred frames contributed to the VAE across all timescales. However, we found no evidence for an interaction between spatial reference frames and adaptation duration. Our results indicate that the VAE is underpinned by multiple spatial reference frames that are similarly leveraged by the underlying time-sensitive mechanisms.

Bibliographic Details
Main Authors: Watson, David Mark, Akeroyd, Michael A., Roach, Neil W., Webb, Ben S.
Format: Online Article Text
Language: English
Published: Public Library of Science 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8128243/
https://www.ncbi.nlm.nih.gov/pubmed/33999940
http://dx.doi.org/10.1371/journal.pone.0251827
_version_ 1783694082295463936
author Watson, David Mark
Akeroyd, Michael A.
Roach, Neil W.
Webb, Ben S.
author_facet Watson, David Mark
Akeroyd, Michael A.
Roach, Neil W.
Webb, Ben S.
author_sort Watson, David Mark
collection PubMed
description In dynamic multisensory environments, the perceptual system corrects for discrepancies arising between modalities. For instance, in the ventriloquism aftereffect (VAE), spatial disparities introduced between visual and auditory stimuli lead to a perceptual recalibration of auditory space. Previous research has shown that the VAE is underpinned by multiple recalibration mechanisms tuned to different timescales; however, it remains unclear whether these mechanisms use common or distinct spatial reference frames. Here we asked whether the VAE operates in eye- or head-centred reference frames across a range of adaptation timescales, from a few seconds to a few minutes. We developed a novel paradigm for selectively manipulating the contribution of eye- versus head-centred visual signals to the VAE by manipulating auditory locations relative to either the head orientation or the point of fixation. Consistent with previous research, we found both eye- and head-centred frames contributed to the VAE across all timescales. However, we found no evidence for an interaction between spatial reference frames and adaptation duration. Our results indicate that the VAE is underpinned by multiple spatial reference frames that are similarly leveraged by the underlying time-sensitive mechanisms.
format Online
Article
Text
id pubmed-8128243
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-8128243 2021-05-27 Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies Watson, David Mark Akeroyd, Michael A. Roach, Neil W. Webb, Ben S. PLoS One Research Article In dynamic multisensory environments, the perceptual system corrects for discrepancies arising between modalities. For instance, in the ventriloquism aftereffect (VAE), spatial disparities introduced between visual and auditory stimuli lead to a perceptual recalibration of auditory space. Previous research has shown that the VAE is underpinned by multiple recalibration mechanisms tuned to different timescales; however, it remains unclear whether these mechanisms use common or distinct spatial reference frames. Here we asked whether the VAE operates in eye- or head-centred reference frames across a range of adaptation timescales, from a few seconds to a few minutes. We developed a novel paradigm for selectively manipulating the contribution of eye- versus head-centred visual signals to the VAE by manipulating auditory locations relative to either the head orientation or the point of fixation. Consistent with previous research, we found both eye- and head-centred frames contributed to the VAE across all timescales. However, we found no evidence for an interaction between spatial reference frames and adaptation duration. Our results indicate that the VAE is underpinned by multiple spatial reference frames that are similarly leveraged by the underlying time-sensitive mechanisms. Public Library of Science 2021-05-17 /pmc/articles/PMC8128243/ /pubmed/33999940 http://dx.doi.org/10.1371/journal.pone.0251827 Text en © 2021 Watson et al https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Watson, David Mark
Akeroyd, Michael A.
Roach, Neil W.
Webb, Ben S.
Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies
title Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies
title_full Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies
title_fullStr Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies
title_full_unstemmed Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies
title_short Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies
title_sort multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8128243/
https://www.ncbi.nlm.nih.gov/pubmed/33999940
http://dx.doi.org/10.1371/journal.pone.0251827
work_keys_str_mv AT watsondavidmark multiplespatialreferenceframesunderpinperceptualrecalibrationtoaudiovisualdiscrepancies
AT akeroydmichaela multiplespatialreferenceframesunderpinperceptualrecalibrationtoaudiovisualdiscrepancies
AT roachneilw multiplespatialreferenceframesunderpinperceptualrecalibrationtoaudiovisualdiscrepancies
AT webbbens multiplespatialreferenceframesunderpinperceptualrecalibrationtoaudiovisualdiscrepancies