
The Timing of Vision – How Neural Processing Links to Different Temporal Dynamics


Bibliographic Details
Main authors: Masquelier, Timothée, Albantakis, Larissa, Deco, Gustavo
Format: Online Article Text
Language: English
Published: Frontiers Research Foundation 2011
Subjects: Psychology
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3129241/
https://www.ncbi.nlm.nih.gov/pubmed/21747774
http://dx.doi.org/10.3389/fpsyg.2011.00151
_version_ 1782207526508429312
author Masquelier, Timothée
Albantakis, Larissa
Deco, Gustavo
author_facet Masquelier, Timothée
Albantakis, Larissa
Deco, Gustavo
author_sort Masquelier, Timothée
collection PubMed
description In this review, we describe our recent attempts to model the neural correlates of visual perception with biologically inspired networks of spiking neurons, emphasizing the dynamical aspects. Experimental evidence suggests distinct processing modes depending on the type of task the visual system is engaged in. A first mode, crucial for object recognition, deals with rapidly extracting the glimpse of a visual scene in the first 100 ms after its presentation. The promptness of this process points to mainly feedforward processing, which relies on latency coding, and may be shaped by spike timing-dependent plasticity (STDP). Our simulations confirm the plausibility and efficiency of such a scheme. A second mode can be engaged whenever one needs to perform finer perceptual discrimination through evidence accumulation on the order of 400 ms and above. Here, our simulations, together with theoretical considerations, show how predominantly local recurrent connections and long neural time-constants enable the integration and build-up of firing rates on this timescale. In particular, we review how a non-linear model with attractor states induced by strong recurrent connectivity provides straightforward explanations for several recent experimental observations. A third mode, involving additional top-down attentional signals, is relevant for more complex visual scene processing. In the model, as in the brain, these top-down attentional signals shape visual processing by biasing the competition between different pools of neurons. The winning pools may not only have a higher firing rate, but also more synchronous oscillatory activity. This fourth mode, oscillatory activity, leads to faster reaction times and enhanced information transfers in the model. This has indeed been observed experimentally. Moreover, oscillatory activity can format spike times and encode information in the spike phases with respect to the oscillatory cycle. This phenomenon is referred to as “phase-of-firing coding,” and experimental evidence for it is accumulating in the visual system. Simulations show that this code can again be efficiently decoded by STDP. Future work should focus on continuous natural vision, bio-inspired hardware vision systems, and novel experimental paradigms to further distinguish current modeling approaches.
format Online
Article
Text
id pubmed-3129241
institution National Center for Biotechnology Information
language English
publishDate 2011
publisher Frontiers Research Foundation
record_format MEDLINE/PubMed
spelling pubmed-3129241 2011-07-11 The Timing of Vision – How Neural Processing Links to Different Temporal Dynamics Masquelier, Timothée Albantakis, Larissa Deco, Gustavo Front Psychol Psychology Frontiers Research Foundation 2011-06-30 /pmc/articles/PMC3129241/ /pubmed/21747774 http://dx.doi.org/10.3389/fpsyg.2011.00151 Text en Copyright © 2011 Masquelier, Albantakis and Deco. http://www.frontiersin.org/licenseagreement This is an open-access article subject to a non-exclusive license between the authors and Frontiers Media SA, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited and other Frontiers conditions are complied with.
spellingShingle Psychology
Masquelier, Timothée
Albantakis, Larissa
Deco, Gustavo
The Timing of Vision – How Neural Processing Links to Different Temporal Dynamics
title The Timing of Vision – How Neural Processing Links to Different Temporal Dynamics
title_full The Timing of Vision – How Neural Processing Links to Different Temporal Dynamics
title_fullStr The Timing of Vision – How Neural Processing Links to Different Temporal Dynamics
title_full_unstemmed The Timing of Vision – How Neural Processing Links to Different Temporal Dynamics
title_short The Timing of Vision – How Neural Processing Links to Different Temporal Dynamics
title_sort timing of vision – how neural processing links to different temporal dynamics
topic Psychology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3129241/
https://www.ncbi.nlm.nih.gov/pubmed/21747774
http://dx.doi.org/10.3389/fpsyg.2011.00151
work_keys_str_mv AT masqueliertimothee thetimingofvisionhowneuralprocessinglinkstodifferenttemporaldynamics
AT albantakislarissa thetimingofvisionhowneuralprocessinglinkstodifferenttemporaldynamics
AT decogustavo thetimingofvisionhowneuralprocessinglinkstodifferenttemporaldynamics
AT masqueliertimothee timingofvisionhowneuralprocessinglinkstodifferenttemporaldynamics
AT albantakislarissa timingofvisionhowneuralprocessinglinkstodifferenttemporaldynamics
AT decogustavo timingofvisionhowneuralprocessinglinkstodifferenttemporaldynamics