
Reliability-Weighted Integration of Audiovisual Signals Can Be Modulated by Top-down Attention

Behaviorally, it is well established that human observers integrate signals near-optimally weighted in proportion to their reliabilities as predicted by maximum likelihood estimation. Yet, despite abundant behavioral evidence, it is unclear how the human brain accomplishes this feat. In a spatial ventriloquist paradigm, participants were presented with auditory, visual, and audiovisual signals and reported the location of the auditory or the visual signal. Combining psychophysics, multivariate functional MRI (fMRI) decoding, and models of maximum likelihood estimation (MLE), we characterized the computational operations underlying audiovisual integration at distinct cortical levels. We estimated observers’ behavioral weights by fitting psychometric functions to participants’ localization responses. Likewise, we estimated the neural weights by fitting neurometric functions to spatial locations decoded from regional fMRI activation patterns. Our results demonstrate that low-level auditory and visual areas encode predominantly the spatial location of the signal component of a region’s preferred auditory (or visual) modality. By contrast, intraparietal sulcus forms spatial representations by integrating auditory and visual signals weighted by their reliabilities. Critically, the neural and behavioral weights and the variance of the spatial representations depended not only on the sensory reliabilities as predicted by the MLE model but also on participants’ modality-specific attention and report (i.e., visual vs. auditory). These results suggest that audiovisual integration is not exclusively determined by bottom-up sensory reliabilities. Instead, modality-specific attention and report can flexibly modulate how intraparietal sulcus integrates sensory signals into spatial representations to guide behavioral responses (e.g., localization and orienting).
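The reliability-weighted (MLE) fusion rule that the abstract refers to can be sketched as follows. This is a minimal illustration with hypothetical numbers, not the authors' analysis code: each unisensory estimate is weighted by its reliability (inverse variance), so the more reliable cue dominates the fused location estimate.

```python
def mle_fusion(x_aud, sigma_aud, x_vis, sigma_vis):
    """Fuse auditory and visual location estimates by reliability weighting (MLE)."""
    r_aud = 1.0 / sigma_aud**2          # auditory reliability = inverse variance
    r_vis = 1.0 / sigma_vis**2          # visual reliability
    w_aud = r_aud / (r_aud + r_vis)     # weights are proportional to reliability
    w_vis = r_vis / (r_aud + r_vis)     # and sum to 1
    fused = w_aud * x_aud + w_vis * x_vis
    # The fused estimate is more precise than either unisensory estimate alone.
    fused_sigma = (1.0 / (r_aud + r_vis)) ** 0.5
    return fused, fused_sigma

# Hypothetical example: a reliable visual cue (sigma = 1) pulls the percept
# toward itself, away from a noisy auditory cue (sigma = 4) -- the
# "ventriloquist" effect in the spatial paradigm described above.
loc, sd = mle_fusion(x_aud=10.0, sigma_aud=4.0, x_vis=0.0, sigma_vis=1.0)
```

Here the fused location lands near the visual cue (10/17 ≈ 0.59 of a degree from it), and the fused standard deviation is below that of either cue alone, as MLE predicts.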

Bibliographic Details
Main Authors: Rohe, Tim; Noppeney, Uta
Format: Online Article Text
Language: English
Published: Society for Neuroscience, 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5844059/
https://www.ncbi.nlm.nih.gov/pubmed/29527567
http://dx.doi.org/10.1523/ENEURO.0315-17.2018
Collection: PubMed
Record ID: pubmed-5844059
Institution: National Center for Biotechnology Information
Record format: MEDLINE/PubMed
Journal: eNeuro ("New Research" section)
Published online: 2018-03-08
Copyright © 2018 Rohe and Noppeney. This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium provided that the original work is properly attributed.
Topic: New Research