Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech
Research on asynchronous audiovisual speech perception manipulates experimental conditions to observe their effects on synchrony judgments. Probabilistic models establish a link between the sensory and decisional processes underlying such judgments and the observed data, via interpretable parameters...
Main Authors: | García-Pérez, Miguel A.; Alcalá-Quintana, Rocío
Format: | Online Article Text
Language: | English
Published: | SAGE Publications, 2015
Subjects: | Article
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4975115/ https://www.ncbi.nlm.nih.gov/pubmed/27551361 http://dx.doi.org/10.1177/2041669515615735
_version_ | 1782446660562976768 |
author | García-Pérez, Miguel A. Alcalá-Quintana, Rocío |
author_facet | García-Pérez, Miguel A. Alcalá-Quintana, Rocío |
author_sort | García-Pérez, Miguel A. |
collection | PubMed |
description | Research on asynchronous audiovisual speech perception manipulates experimental conditions to observe their effects on synchrony judgments. Probabilistic models establish a link between the sensory and decisional processes underlying such judgments and the observed data, via interpretable parameters that allow testing hypotheses and making inferences about how experimental manipulations affect such processes. Two models of this type have recently been proposed, one based on independent channels and the other using a Bayesian approach. Both models are fitted here to a common data set, with a subsequent analysis of the interpretation they provide about how experimental manipulations affected the processes underlying perceived synchrony. The data consist of synchrony judgments as a function of audiovisual offset in a speech stimulus, under four within-subjects manipulations of the quality of the visual component. The Bayesian model could not accommodate asymmetric data, was rejected by goodness-of-fit statistics for 8/16 observers, and was found to be nonidentifiable, which renders uninterpretable parameter estimates. The independent-channels model captured asymmetric data, was rejected for only 1/16 observers, and identified how sensory and decisional processes mediating asynchronous audiovisual speech perception are affected by manipulations that only alter the quality of the visual component of the speech signal. |
format | Online Article Text |
id | pubmed-4975115 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2015 |
publisher | SAGE Publications |
record_format | MEDLINE/PubMed |
spelling | pubmed-49751152016-08-22 Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech García-Pérez, Miguel A. Alcalá-Quintana, Rocío Iperception Article Research on asynchronous audiovisual speech perception manipulates experimental conditions to observe their effects on synchrony judgments. Probabilistic models establish a link between the sensory and decisional processes underlying such judgments and the observed data, via interpretable parameters that allow testing hypotheses and making inferences about how experimental manipulations affect such processes. Two models of this type have recently been proposed, one based on independent channels and the other using a Bayesian approach. Both models are fitted here to a common data set, with a subsequent analysis of the interpretation they provide about how experimental manipulations affected the processes underlying perceived synchrony. The data consist of synchrony judgments as a function of audiovisual offset in a speech stimulus, under four within-subjects manipulations of the quality of the visual component. The Bayesian model could not accommodate asymmetric data, was rejected by goodness-of-fit statistics for 8/16 observers, and was found to be nonidentifiable, which renders uninterpretable parameter estimates. The independent-channels model captured asymmetric data, was rejected for only 1/16 observers, and identified how sensory and decisional processes mediating asynchronous audiovisual speech perception are affected by manipulations that only alter the quality of the visual component of the speech signal. SAGE Publications 2015-11-30 /pmc/articles/PMC4975115/ /pubmed/27551361 http://dx.doi.org/10.1177/2041669515615735 Text en © The Author(s) 2015 http://creativecommons.org/licenses/by/3.0/ This article is distributed under the terms of the Creative Commons Attribution 3.0 License (http://www.creativecommons.org/licenses/by/3.0/) which permits any use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access page (https://us.sagepub.com/en-us/nam/open-access-at-sage). |
spellingShingle | Article García-Pérez, Miguel A. Alcalá-Quintana, Rocío Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech |
title | Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech |
title_full | Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech |
title_fullStr | Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech |
title_full_unstemmed | Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech |
title_short | Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech |
title_sort | visual and auditory components in the perception of asynchronous audiovisual speech |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4975115/ https://www.ncbi.nlm.nih.gov/pubmed/27551361 http://dx.doi.org/10.1177/2041669515615735 |
work_keys_str_mv | AT garciaperezmiguela visualandauditorycomponentsintheperceptionofasynchronousaudiovisualspeech AT alcalaquintanarocio visualandauditorycomponentsintheperceptionofasynchronousaudiovisualspeech |
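The description field above characterizes the modeling approach only verbally. As a rough illustration of what fitting a probabilistic model of synchrony judgments to offset-by-offset response counts involves, the sketch below fits a simplified Gaussian "window" model by maximum likelihood. This is not the article's independent-channels or Bayesian formulation, and the SOA values, response counts, and parameter names (mu, sigma, delta) are hypothetical.

```python
# Minimal sketch (not the paper's model): maximum-likelihood fit of a simple
# Gaussian window model of synchrony judgments. The probability of a
# "synchronous" response at audiovisual offset soa is the probability that the
# perceived asynchrony, centred at soa - mu with latency noise sigma, falls
# inside a decision window of half-width delta. All data below are hypothetical.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

soa = np.array([-300.0, -200.0, -100.0, 0.0, 100.0, 200.0, 300.0])  # offsets (ms), hypothetical
n_sync = np.array([2, 8, 18, 24, 20, 10, 3])                        # "synchronous" counts, hypothetical
n_total = np.full_like(n_sync, 25)                                  # trials per offset, hypothetical

def p_sync(soa, mu, sigma, delta):
    # P(-delta < (soa - mu) + noise < delta), with noise ~ Normal(0, sigma)
    z_hi = (delta - (soa - mu)) / sigma
    z_lo = (-delta - (soa - mu)) / sigma
    return norm.cdf(z_hi) - norm.cdf(z_lo)

def neg_log_lik(params):
    mu, log_sigma, log_delta = params  # log-parameterise to keep sigma, delta > 0
    p = p_sync(soa, mu, np.exp(log_sigma), np.exp(log_delta))
    p = np.clip(p, 1e-9, 1.0 - 1e-9)
    # Binomial log-likelihood of the observed counts, up to a constant
    return -np.sum(n_sync * np.log(p) + (n_total - n_sync) * np.log(1.0 - p))

fit = minimize(neg_log_lik, x0=[0.0, np.log(100.0), np.log(150.0)], method="Nelder-Mead")
mu_hat, sigma_hat, delta_hat = fit.x[0], np.exp(fit.x[1]), np.exp(fit.x[2])
print(f"mu = {mu_hat:.1f} ms, sigma = {sigma_hat:.1f} ms, delta = {delta_hat:.1f} ms")
```

From a fit of this kind, a per-observer goodness-of-fit statistic (for instance a likelihood-ratio comparison against the saturated binomial model) can be computed, which is the sort of criterion by which the abstract reports the two models as rejected or retained for individual observers.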