
Synaptic encoding of temporal contiguity


Bibliographic Details
Main Authors: Ostojic, Srdjan; Fusi, Stefano
Format: Online Article (Text)
Language: English
Published: Frontiers Media S.A., 2013
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3640208/
https://www.ncbi.nlm.nih.gov/pubmed/23641210
http://dx.doi.org/10.3389/fncom.2013.00032
author Ostojic, Srdjan
Fusi, Stefano
collection PubMed
description Often we need to perform tasks in an environment that changes stochastically. In these situations it is important to learn the statistics of sequences of events in order to predict the future and the outcome of our actions. The statistical description of many of these sequences can be reduced to the set of probabilities that a particular event follows another event (temporal contiguity). Under these conditions, it is important to encode and store in our memory these transition probabilities. Here we show that for a large class of synaptic plasticity models, the distribution of synaptic strengths encodes transition probabilities. Specifically, when the synaptic dynamics depend on pairs of contiguous events and the synapses can remember multiple instances of the transitions, the average synaptic weights are a monotonic function of the transition probabilities. The synaptic weights converge to the distribution encoding the probabilities even when correlations between consecutive synaptic modifications are taken into account. We studied how this distribution depends on the number of synaptic states for a specific model of a multi-state synapse with hard bounds. In the case of bistable synapses, the average synaptic weights are a smooth function of the transition probabilities and the accuracy of the encoding depends on the learning rate. As the number of synaptic states increases, the average synaptic weights become a step function of the transition probabilities. Finally, we show that the information stored in the synaptic weights can be read out by a simple rate-based neural network. Our study shows that synapses encode transition probabilities under general assumptions, which indicates that temporal contiguity is likely to be encoded and harnessed in almost every neural circuit in the brain.
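The bistable-synapse result summarized in the abstract lends itself to a short numerical sketch. The following is a hypothetical Monte Carlo simulation, not the authors' code: a binary synapse is potentiated with a fixed learning rate whenever the observed transition occurs and depressed at the same rate otherwise, so its time-averaged weight converges to the transition probability. All function names and parameter values are illustrative assumptions.

```python
import random

def simulate_bistable_synapse(p_transition, learning_rate, n_events=100_000, seed=0):
    """Monte Carlo sketch of a bistable synapse encoding a transition probability.

    On each trial, event B follows event A with probability ``p_transition``.
    When the transition occurs, the synapse jumps to its potentiated state
    with probability ``learning_rate``; when it does not occur, the synapse
    jumps to its depressed state with the same probability.  With symmetric
    rates, the equilibrium occupancy of the potentiated state equals
    ``p_transition``, so the time-averaged weight estimates it.
    """
    rng = random.Random(seed)
    weight = 0          # 0 = depressed state, 1 = potentiated state
    weight_sum = 0
    for _ in range(n_events):
        if rng.random() < p_transition:        # transition A -> B observed
            if rng.random() < learning_rate:   # stochastic potentiation
                weight = 1
        else:                                  # A followed by some other event
            if rng.random() < learning_rate:   # stochastic depression
                weight = 0
        weight_sum += weight
    return weight_sum / n_events
```

In this sketch the time-averaged weight lands close to `p_transition` (e.g. `simulate_bistable_synapse(0.7, 0.1)` returns roughly 0.7). A smaller learning rate slows the fluctuations of an individual synapse, consistent with the abstract's remark that the accuracy of the encoding depends on the learning rate.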
format Online
Article
Text
id pubmed-3640208
institution National Center for Biotechnology Information
language English
publishDate 2013
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-3640208 2013-05-02 Synaptic encoding of temporal contiguity Ostojic, Srdjan; Fusi, Stefano Front Comput Neurosci Neuroscience Frontiers Media S.A.
2013-04-12 /pmc/articles/PMC3640208/ /pubmed/23641210 http://dx.doi.org/10.3389/fncom.2013.00032 Text en Copyright © 2013 Ostojic and Fusi. http://creativecommons.org/licenses/by/3.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited and subject to any copyright notices concerning any third-party graphics etc.
title Synaptic encoding of temporal contiguity
topic Neuroscience
work_keys_str_mv AT ostojicsrdjan synapticencodingoftemporalcontiguity
AT fusistefano synapticencodingoftemporalcontiguity