
General differential Hebbian learning: Capturing temporal relations between events in neural networks and the brain

Learning in biologically relevant neural-network models usually relies on Hebb learning rules. The typical implementations of these rules change the synaptic strength on the basis of the co-occurrence of the neural events taking place at a certain time in the pre- and post-synaptic neurons. Differential Hebbian learning (DHL) rules, instead, are able to update the synapse by taking into account the temporal relation, captured with derivatives, between the neural events happening in the recent past. The few DHL rules proposed so far can update the synaptic weights only in few ways: this is a limitation for the study of dynamical neurons and neural-network models. Moreover, empirical evidence on brain spike-timing-dependent plasticity (STDP) shows that different neurons express a surprisingly rich repertoire of different learning processes going far beyond existing DHL rules. This opens up a second problem of how capturing such processes with DHL rules. Here we propose a general DHL (G-DHL) rule generating the existing rules and many others. The rule has a high expressiveness as it combines in different ways the pre- and post-synaptic neuron signals and derivatives. The rule flexibility is shown by applying it to various signals of artificial neurons and by fitting several different STDP experimental data sets. To these purposes, we propose techniques to pre-process the neural signals and capture the temporal relations between the neural events of interest. We also propose a procedure to automatically identify the rule components and parameters that best fit different STDP data sets, and show how the identified components might be used to heuristically guide the search of the biophysical mechanisms underlying STDP. Overall, the results show that the G-DHL rule represents a useful means to study time-sensitive learning processes in both artificial neural networks and brain.
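For orientation only, and not the article's actual G-DHL formulation (whose exact form is given in the paper): a differential Hebbian rule updates a weight from products of the pre- and post-synaptic signals and their time derivatives, e.g. Δw ∝ ∫ x(t) ẏ(t) dt. The sketch below, with assumed Gaussian-smoothed spike events, an assumed learning rate, and illustrative function names, shows how one such signal-derivative product already yields a timing-dependent weight change.

```python
import numpy as np

def smoothed_spike(t, t_spike, sigma=10.0):
    """Gaussian bump standing in for a low-pass-filtered spike
    (an assumed pre-processing step, not the article's own filter)."""
    return np.exp(-0.5 * ((t - t_spike) / sigma) ** 2)

def dhl_weight_change(dt_post_minus_pre, eta=1.0, sigma=10.0):
    """One differential Hebbian combination: dw = eta * integral of x(t) * dy/dt dt,
    where x is the pre-synaptic trace and y the post-synaptic trace.
    Other combinations (dx/dt * y, dx/dt * dy/dt, x * y) weight timing differently."""
    t = np.arange(-100.0, 100.0, 0.1)                # time axis (ms)
    x = smoothed_spike(t, 0.0, sigma)                # pre-synaptic event at t = 0
    y = smoothed_spike(t, dt_post_minus_pre, sigma)  # post-synaptic event at t = dt
    dy = np.gradient(y, t)                           # numerical derivative of the post trace
    return eta * np.sum(x * dy) * (t[1] - t[0])      # correlate pre signal with post derivative

if __name__ == "__main__":
    # Pre-before-post (dt > 0) gives a positive change, post-before-pre a negative one,
    # i.e. an antisymmetric, STDP-like window from this particular combination.
    for dt in (-20.0, -5.0, 5.0, 20.0):
        print(f"dt = {dt:+5.1f} ms -> dw = {dhl_weight_change(dt):+.4f}")
```

Swapping in other products of signals and derivatives changes the shape of the resulting timing window, which is the kind of flexibility the abstract attributes to the general rule.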


Bibliographic Details
Main Authors: Zappacosta, Stefano; Mannella, Francesco; Mirolli, Marco; Baldassarre, Gianluca
Format: Online Article Text
Language: English
Published: Public Library of Science, 2018-08-28
Subjects: Research Article
Collection: PubMed (record pubmed-6130884, MEDLINE/PubMed format), National Center for Biotechnology Information
Rights: © 2018 Zappacosta et al. Open access under the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6130884/
https://www.ncbi.nlm.nih.gov/pubmed/30153263
http://dx.doi.org/10.1371/journal.pcbi.1006227