
Asynchrony adaptation reveals neural population code for audio-visual timing

The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships remains poorly understood. Recent studies indicate that our perception of multisensory timing is flexible: adaptation to a regular inter-modal delay alters the point at which subsequent stimuli are judged to be simultaneous. Here, we measure the effect of audio-visual asynchrony adaptation on the perception of a wide range of sub-second temporal relationships. We find distinctive patterns of induced biases that are inconsistent with previous explanations based on changes in perceptual latency. Instead, our results can be well accounted for by a neural population coding model in which: (i) relative audio-visual timing is represented by the distributed activity across a relatively small number of neurons tuned to different delays; (ii) the algorithm for reading out this population code is efficient, but subject to biases owing to under-sampling; and (iii) the effect of adaptation is to modify neuronal response gain. These results suggest that multisensory timing information is represented by a dedicated population code and that shifts in perceived simultaneity following asynchrony adaptation arise from analogous neural processes to well-known perceptual after-effects.
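To make the abstract's three-part model concrete, the following is a minimal sketch of a delay-tuned population code with a centroid readout and gain-based adaptation. It is an illustration of the general technique only: the Gaussian tuning curves, channel count, and all parameter values are assumptions chosen for demonstration, not values taken from the paper.

```python
import numpy as np

# Minimal sketch of the population-coding account described in the abstract.
# Tuning shape, channel count, and all parameter values are illustrative
# assumptions, not values reported in the paper.

# (i) A small bank of neurons tuned to different audio-visual delays (ms),
#     sparsely sampling the sub-second range.
preferred_delays = np.linspace(-300, 300, 7)
tuning_width = 150.0  # assumed Gaussian tuning bandwidth (ms)

def population_response(delay_ms, gains):
    """Responses of the delay-tuned neurons to a stimulus with a given AV delay."""
    return gains * np.exp(-0.5 * ((delay_ms - preferred_delays) / tuning_width) ** 2)

def read_out(responses):
    """(ii) Centroid (population-vector) readout: efficient, but biased when
    the delay axis is under-sampled by only a few tuned channels."""
    return np.sum(responses * preferred_delays) / np.sum(responses)

# (iii) Adaptation to a regular +100 ms asynchrony is modelled as a gain
#       reduction in the neurons most responsive to the adapted delay.
baseline_gains = np.ones_like(preferred_delays)
adapted_gains = 1.0 - 0.4 * np.exp(
    -0.5 * ((100.0 - preferred_delays) / tuning_width) ** 2
)

for test_delay in (-100.0, 0.0, 100.0):
    before = read_out(population_response(test_delay, baseline_gains))
    after = read_out(population_response(test_delay, adapted_gains))
    print(f"test {test_delay:+6.1f} ms -> perceived {before:+6.1f} ms (pre), "
          f"{after:+6.1f} ms (post-adaptation)")
```

Running this sketch shows perceived delays shifting away from the adapted asynchrony after the gain change, the repulsive after-effect pattern the abstract likens to well-known perceptual after-effects.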


Bibliographic Details
Main Authors: Roach, Neil W., Heron, James, Whitaker, David, McGraw, Paul V.
Format: Text
Language: English
Published: The Royal Society, 2011
Subjects: Research Articles
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3061136/
https://www.ncbi.nlm.nih.gov/pubmed/20961905
http://dx.doi.org/10.1098/rspb.2010.1737
Record ID: pubmed-3061136
Collection: PubMed
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Proc Biol Sci (Research Articles)
Publication Dates: 2010-10-20 (online); 2011-05-07 (issue)
License: This Journal is © 2010 The Royal Society. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.