
Sight and sound out of synch: Fragmentation and renormalisation of audiovisual integration and subjective timing


Bibliographic Details

Main Authors: Freeman, Elliot D., Ipser, Alberta, Palmbaha, Austra, Paunoiu, Diana, Brown, Peter, Lambert, Christian, Leff, Alex, Driver, Jon
Format: Online Article Text
Language: English
Published: Masson 2013
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3878386/
https://www.ncbi.nlm.nih.gov/pubmed/23664001
http://dx.doi.org/10.1016/j.cortex.2013.03.006
collection PubMed
description The sight and sound of a person speaking or a ball bouncing may seem simultaneous, but their corresponding neural signals are spread out over time as they arrive at different multisensory brain sites. How subjective timing relates to such neural timing remains a fundamental neuroscientific and philosophical puzzle. A dominant assumption is that temporal coherence is achieved by sensory resynchronisation or recalibration across asynchronous brain events. This assumption is easily confirmed by estimating subjective audiovisual timing for groups of subjects, which is on average similar across different measures and stimuli, and approximately veridical. But few studies have examined normal and pathological individual differences in such measures. Case PH, with lesions in pons and basal ganglia, hears people speak before seeing their lips move. Temporal order judgements (TOJs) confirmed this: voices had to lag lip-movements (by ∼200 msec) to seem synchronous to PH. Curiously, voices had to lead lips (also by ∼200 msec) to maximise the McGurk illusion (a measure of audiovisual speech integration). On average across these measures, PH's timing was therefore still veridical. Age-matched control participants showed similar discrepancies. Indeed, normal individual differences in TOJ and McGurk timing correlated negatively: subjects needing an auditory lag for subjective simultaneity needed an auditory lead for maximal McGurk, and vice versa. This generalised to the Stream–Bounce illusion. Such surprising antagonism seems opposed to good sensory resynchronisation, yet average timing across tasks was still near-veridical. Our findings reveal remarkable disunity of audiovisual timing within and between subjects. To explain this we propose that the timing of audiovisual signals within different brain mechanisms is perceived relative to the average timing across mechanisms. 
Such renormalisation fully explains the curious antagonistic relationship between disparate timing estimates in PH and healthy participants, and how they can still perceive the timing of external events correctly, on average.
format Online Article Text
id pubmed-3878386
institution National Center for Biotechnology Information
language English
publishDate 2013
publisher Masson
record_format MEDLINE/PubMed
spelling pubmed-3878386 2014-01-02 Cortex Research Report Masson 2013-11 /pmc/articles/PMC3878386/ /pubmed/23664001 http://dx.doi.org/10.1016/j.cortex.2013.03.006 Text en © 2013 Elsevier Srl. All rights reserved. Open Access under CC BY 3.0 (https://creativecommons.org/licenses/by/3.0/) license
title Sight and sound out of synch: Fragmentation and renormalisation of audiovisual integration and subjective timing
topic Research Report