Neural Encoding of Active Multi-Sensing Enhances Perceptual Decision-Making via a Synergistic Cross-Modal Interaction
Main Authors: , , ,
Format: Online Article Text
Language: English
Published: Society for Neuroscience, 2022
Subjects:
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8936614/
https://www.ncbi.nlm.nih.gov/pubmed/35091504
http://dx.doi.org/10.1523/JNEUROSCI.0861-21.2022
Summary:

Most perceptual decisions rely on the active acquisition of evidence from the environment involving stimulation from multiple senses. However, our understanding of the neural mechanisms underlying this process is limited. Crucially, it remains elusive how different sensory representations interact in the formation of perceptual decisions. To answer these questions, we used an active sensing paradigm coupled with neuroimaging, multivariate analysis, and computational modeling to probe how the human brain processes multisensory information to make perceptual judgments. Participants of both sexes actively sensed to discriminate two texture stimuli using visual (V) or haptic (H) information or the two sensory cues together (VH). Crucially, information acquisition was under the participants' control, who could choose where to sample information from and for how long on each trial. To understand the neural underpinnings of this process, we first characterized where and when active sensory experience (movement patterns) is encoded in human brain activity (EEG) in the three sensory conditions. Then, to offer a neurocomputational account of active multisensory decision formation, we used these neural representations of active sensing to inform a drift diffusion model of decision-making behavior. This revealed a multisensory enhancement of the neural representation of active sensing, which led to faster and more accurate multisensory decisions. We then dissected the interactions between the V, H, and VH representations using a novel information-theoretic methodology. Ultimately, we identified a synergistic neural interaction between the two unisensory (V, H) representations over contralateral somatosensory and motor locations that predicted multisensory (VH) decision-making performance.

SIGNIFICANCE STATEMENT In real-world settings, perceptual decisions are made during active behaviors, such as crossing the road on a rainy night, and include information from different senses (e.g., car lights, slippery ground). Critically, it remains largely unknown how sensory evidence is combined and translated into perceptual decisions in such active scenarios. Here we address this knowledge gap. First, we show that the simultaneous exploration of information across senses (multi-sensing) enhances the neural encoding of active sensing movements. Second, the neural representation of active sensing modulates the evidence available for the decision; and importantly, multi-sensing yields faster evidence accumulation. Finally, we identify a cross-modal interaction in the human brain that correlates with multisensory performance, constituting a putative neural mechanism for forging active multisensory perception.
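To illustrate the modeling step described in the summary, below is a minimal drift-diffusion sketch in Python. It is not the authors' fitted model; the drift rates, bound, noise level, and non-decision time are arbitrary assumptions chosen only to show how a larger drift rate (standing in for the enhanced multisensory encoding of active sensing) yields faster and more accurate simulated decisions.

```python
import numpy as np

def simulate_ddm(drift, boundary=1.0, noise=1.0, dt=0.001,
                 non_decision_time=0.3, max_time=3.0, n_trials=2000,
                 rng=None):
    """Simulate two-bound drift diffusion; return accuracy and mean RT (s).

    Trials that do not reach a bound within max_time are simply skipped.
    All parameter values are illustrative assumptions.
    """
    rng = np.random.default_rng(rng)
    n_steps = int(max_time / dt)
    rts, correct = [], []
    for _ in range(n_trials):
        x = 0.0  # decision variable starts midway between the two bounds
        for step in range(n_steps):
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            if abs(x) >= boundary:
                rts.append(non_decision_time + (step + 1) * dt)
                correct.append(x >= boundary)  # upper bound = correct choice
                break
    return np.mean(correct), np.mean(rts)

# Hypothetical drift rates: the multisensory (VH) condition is given a higher
# drift than the unisensory (V, H) conditions, mimicking faster evidence
# accumulation; the printed accuracies and RTs reflect that difference.
for label, drift in [("V", 0.8), ("H", 0.7), ("VH", 1.3)]:
    acc, rt = simulate_ddm(drift, rng=0)
    print(f"{label}: accuracy={acc:.2f}, mean RT={rt:.2f}s")
```

The summary also mentions an information-theoretic analysis of how the V and H representations interact. The sketch below computes a generic interaction-information estimate on discretized variables, I(V,H;Y) - I(V;Y) - I(H;Y), where a positive value is consistent with synergy and a negative value with redundancy. This is a textbook quantity, not the specific measure used in the paper, and the inputs (v, h, y) are placeholder variables.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def discretize(x, n_bins=4):
    """Bin a continuous signal into equal-occupancy (quantile) bins."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

def interaction_information(v, h, y, n_bins=4):
    """I(V,H;Y) - I(V;Y) - I(H;Y) in nats; > 0 is consistent with synergy."""
    v_d, h_d = discretize(v, n_bins), discretize(h, n_bins)
    vh_d = v_d * n_bins + h_d  # encode the (V, H) pair as one discrete variable
    return (mutual_info_score(vh_d, y)
            - mutual_info_score(v_d, y)
            - mutual_info_score(h_d, y))

# Toy check: an XOR-like dependence is carried only by V and H jointly,
# so the estimate should come out clearly positive (synergistic).
rng = np.random.default_rng(1)
v, h = rng.standard_normal(20000), rng.standard_normal(20000)
y = ((v > 0) ^ (h > 0)).astype(int)
print(f"interaction information: {interaction_information(v, h, y):.3f} nats")
```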