A dynamical framework to relate perceptual variability with multisensory information processing

Bibliographic Details
Main Authors: Thakur, Bhumika; Mukherjee, Abhishek; Sen, Abhijit; Banerjee, Arpan
Format: Online Article Text
Language: English
Published: Nature Publishing Group, 2016
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4977493/
https://www.ncbi.nlm.nih.gov/pubmed/27502974
http://dx.doi.org/10.1038/srep31280
collection PubMed
description Multisensory processing involves the participation of individual sensory streams, e.g., vision and audition, to facilitate perception of environmental stimuli. An experimental realization of the underlying complexity is captured by the “McGurk effect”: incongruent auditory and visual vocalization stimuli eliciting the perception of illusory speech sounds. Further studies have established that the time delay between the onsets of the auditory and visual signals (AV lag) and perturbations in the unisensory streams are key variables that modulate perception. However, only a few quantitative theoretical frameworks have so far been proposed to understand the interplay among these psychophysical variables or the systems-level neural interactions that govern perceptual variability. Here, we propose a dynamical systems model consisting of the basic ingredients of any multisensory processing reported by several researchers: two unisensory and one multisensory sub-system (nodes). The nodes are connected such that biophysically inspired coupling parameters and time delays become the key parameters of this network. We observed that zero AV lag results in maximum synchronization of the constituent nodes and that the degree of synchronization decreases for non-zero lags. The attractor states of this network can thus be interpreted as facilitating the stabilization of specific perceptual experiences. The dynamical model thereby presents a quantitative framework for understanding multisensory information processing.
format Online
Article
Text
id pubmed-4977493
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher Nature Publishing Group
record_format MEDLINE/PubMed
spelling pubmed-4977493 (indexed 2016-08-22). Sci Rep, Article. Nature Publishing Group, published online 2016-08-09. Text, en. Copyright © 2016, The Author(s). This work is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/); images or other third-party material in the article are included in the article's Creative Commons license unless indicated otherwise in the credit line.
topic Article
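
The description above frames multisensory perception in terms of the synchronization of a small delay-coupled network: two unisensory nodes (auditory, visual) interacting with one multisensory node, with the audio-visual (AV) lag entering as a time delay. Below is a minimal illustrative sketch of that idea, assuming Kuramoto-type phase oscillators, an extra transmission delay on the visual-to-multisensory pathway as a stand-in for the AV lag, and arbitrary coupling, frequency, and delay values; it is not the authors' published model, only a toy demonstration that time-averaged phase synchronization in such a network tends to be highest at zero lag and to degrade as the lag grows.

"""
Minimal sketch (not the authors' model): three delay-coupled Kuramoto phase
oscillators standing in for auditory (A), visual (V) and multisensory (M)
nodes. The AV lag is treated as an extra transmission delay on the V -> M
connection; synchronization is summarized by the time-averaged Kuramoto
order parameter. All dynamics, coupling values and delays are assumptions.
"""
import numpy as np

def simulate(av_lag_ms, k=2.0, base_delay_ms=10.0, f_hz=2.0,
             t_max=20.0, dt=1e-3, seed=0):
    """Integrate the 3-node delayed Kuramoto network with Euler steps and
    return the order parameter averaged over the second half of the run."""
    rng = np.random.default_rng(seed)
    n = 3                                   # nodes: 0 = A, 1 = V, 2 = M
    omega = 2 * np.pi * f_hz * np.ones(n)   # identical natural frequencies
    # coupling matrix K[i, j]: influence of node j on node i
    K = np.array([[0.0, 0.0, k],            # M -> A feedback
                  [0.0, 0.0, k],            # M -> V feedback
                  [k,   k,   0.0]])         # A -> M and V -> M feedforward
    # transmission delays (seconds); the AV lag is added on the V -> M path
    tau = np.full((n, n), base_delay_ms * 1e-3)
    tau[2, 1] += av_lag_ms * 1e-3
    steps = int(t_max / dt)
    lag_steps = np.round(tau / dt).astype(int)
    max_lag = lag_steps.max()
    theta = np.zeros((steps + max_lag, n))
    theta[:max_lag + 1] = rng.uniform(0, 2 * np.pi, n)   # constant history
    for t in range(max_lag, steps + max_lag - 1):
        coupling = np.zeros(n)
        for i in range(n):
            for j in range(n):
                if K[i, j] != 0.0:
                    coupling[i] += K[i, j] * np.sin(
                        theta[t - lag_steps[i, j], j] - theta[t, i])
        theta[t + 1] = theta[t] + dt * (omega + coupling)
    # Kuramoto order parameter R(t) = |mean_j exp(i*theta_j)|, averaged late
    late = theta[steps // 2 + max_lag:]
    r = np.abs(np.exp(1j * late).mean(axis=1))
    return r.mean()

if __name__ == "__main__":
    for lag in [0, 50, 100, 150, 200]:
        print(f"AV lag {lag:4d} ms -> mean order parameter R = "
              f"{simulate(lag):.3f}")

Running the script prints the mean order parameter for a handful of lags; the exact numbers depend on the assumed parameters, but the zero-lag condition typically shows the tightest phase alignment, loosely mirroring the lag dependence of synchronization described in the abstract.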