Cue Integration in Categorical Tasks: Insights from Audio-Visual Speech Perception

Bibliographic Details
Main Authors: Bejjanki, Vikranth Rao; Clayards, Meghan; Knill, David C.; Aslin, Richard N.
Format: Text
Language: English
Published: Public Library of Science, 2011
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3102664/
https://www.ncbi.nlm.nih.gov/pubmed/21637344
http://dx.doi.org/10.1371/journal.pone.0019812
Collection: PubMed
Description: Previous cue integration studies have examined continuous perceptual dimensions (e.g., size) and have shown that human cue integration is well described by a normative model in which cues are weighted in proportion to their sensory reliability, as estimated from single-cue performance. However, this normative model may not be applicable to categorical perceptual dimensions (e.g., phonemes). In tasks defined over categorical perceptual dimensions, optimal cue weights should depend not only on the sensory variance affecting the perception of each cue but also on the environmental variance inherent in each task-relevant category. Here, we present a computational and experimental investigation of cue integration in a categorical audio-visual (articulatory) speech perception task. Our results show that human performance during audio-visual phonemic labeling is qualitatively consistent with the behavior of a Bayes-optimal observer. Specifically, we show that the participants in our task are sensitive, on a trial-by-trial basis, to the sensory uncertainty associated with the auditory and visual cues during phonemic categorization. In addition, we show that while sensory uncertainty is a significant factor in determining cue weights, it is not the only one; participants' performance is consistent with an optimal model in which environmental, within-category variability also plays a role in determining cue weights. Furthermore, we show that in our task, the sensory variability affecting the visual modality during cue combination is not well estimated from single-cue performance, but can be estimated from multi-cue performance. The findings and computational principles described here represent a principled first step towards characterizing the mechanisms underlying human cue integration in categorical tasks.
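The abstract contrasts two weighting schemes: the standard normative model, in which each cue is weighted by its inverse sensory variance, and the extension for categorical tasks, in which within-category (environmental) variance also limits a cue's effective reliability. The following is a minimal sketch of that idea, not the paper's actual model or parameters; function names and variance values are illustrative assumptions.

```python
def cue_weights(var_a, var_v):
    """Standard normative model: weight each cue by its inverse variance
    (reliability), normalized so the weights sum to 1."""
    rel_a, rel_v = 1.0 / var_a, 1.0 / var_v
    total = rel_a + rel_v
    return rel_a / total, rel_v / total

def categorical_cue_weights(sens_var_a, sens_var_v, cat_var_a, cat_var_v):
    """Categorical extension sketched in the abstract: sensory variance and
    within-category (environmental) variance both reduce a cue's effective
    reliability, so they add before the weights are computed."""
    return cue_weights(sens_var_a + cat_var_a, sens_var_v + cat_var_v)

# Continuous case: the auditory cue is twice as variable as the visual cue,
# so it receives the smaller weight (1/3 vs. 2/3).
w_a, w_v = cue_weights(2.0, 1.0)

# Categorical case: large within-category variability on the visual cue
# shifts weight back toward the auditory cue, even though the auditory
# cue's sensory variance is unchanged.
w_a2, w_v2 = categorical_cue_weights(2.0, 1.0, 0.5, 3.0)
```

The key design point is that in the categorical model a cue with low sensory noise can still be down-weighted if the task-relevant categories themselves vary widely along that cue's dimension.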
ID: pubmed-3102664
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Published in: PLoS ONE (Research Article), 2011-05-26. © Bejjanki et al. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.