What can crossmodal aftereffects reveal about neural representation and dynamics?

Bibliographic Details
Main Authors: Konkle, Talia; Moore, Christopher I.
Format: Online Article (Text)
Language: English
Published: Landes Bioscience, 2009
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3398893/
https://www.ncbi.nlm.nih.gov/pubmed/22811763
http://dx.doi.org/10.4161/cib.2.6.9344
Description
Summary: The brain continuously adapts to incoming sensory stimuli, which can lead to perceptual illusions in the form of aftereffects. Recently we demonstrated that motion aftereffects transfer between vision and touch.(1) Here, the adapted brain state induced by one modality has consequences for processes in another modality, implying that somewhere in the processing stream, visual and tactile motion have shared underlying neural representations. We propose the adaptive processing hypothesis: any area that processes a stimulus adapts to the features of the stimulus it represents, and this adaptation has consequences for perception. This view argues that there is no single locus of an aftereffect. Rather, aftereffects emerge when the test stimulus used to probe the effect of adaptation requires processing of a given type. The illusion will reflect the properties of the brain area(s) that support that specific level of representation. We further suggest that many cortical areas are more process-dependent than modality-dependent, with crossmodal interactions reflecting shared processing demands in even ‘early’ sensory cortices.