
The gradient clusteron: A model neuron that learns to solve classification tasks via dendritic nonlinearities, structural plasticity, and gradient descent

Bibliographic Details
Main Authors: Moldwin, Toviah; Kalmenson, Menachem; Segev, Idan
Format: Online, Article, Text
Language: English
Published: Public Library of Science, 2021
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8177649/
https://www.ncbi.nlm.nih.gov/pubmed/34029309
http://dx.doi.org/10.1371/journal.pcbi.1009015
author Moldwin, Toviah
Kalmenson, Menachem
Segev, Idan
collection PubMed
description Synaptic clustering on neuronal dendrites has been hypothesized to play an important role in implementing pattern recognition. Neighboring synapses on a dendritic branch can interact in a synergistic, cooperative manner via nonlinear voltage-dependent mechanisms, such as NMDA receptors. Inspired by the NMDA receptor, the single-branch clusteron learning algorithm takes advantage of location-dependent multiplicative nonlinearities to solve classification tasks by randomly shuffling the locations of “under-performing” synapses on a model dendrite during learning (“structural plasticity”), eventually resulting in synapses with correlated activity being placed next to each other on the dendrite. We propose an alternative model, the gradient clusteron, or G-clusteron, which uses an analytically-derived gradient descent rule where synapses are “attracted to” or “repelled from” each other in an input- and location-dependent manner. We demonstrate the classification ability of this algorithm by testing it on the MNIST handwritten digit dataset and show that, when using a softmax activation function, the accuracy of the G-clusteron on the all-versus-all MNIST task (~85%) approaches that of logistic regression (~93%). In addition to the location update rule, we also derive a learning rule for the synaptic weights of the G-clusteron (“functional plasticity”) and show that a G-clusteron that utilizes the weight update rule can achieve ~89% accuracy on the MNIST task. We also show that a G-clusteron with both the weight and location update rules can learn to solve the XOR problem from arbitrary initial conditions.
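The description above specifies the model only verbally: a somatic response built from location-dependent multiplicative interactions between synapses, with gradient descent applied both to synaptic locations ("structural plasticity") and to synaptic weights ("functional plasticity"). The NumPy sketch below is an illustration of that idea under stated assumptions, not the authors' exact formulation: it assumes a Gaussian proximity kernel between synapse locations, a sigmoid output with cross-entropy loss, and hypothetical names (GClusteronSketch, sigma, eta_l, eta_w) introduced here only for clarity.

```python
# Illustrative sketch only: a clusteron-style unit with an assumed Gaussian
# proximity kernel between synapse locations, trained by gradient descent on
# both the locations ("structural plasticity") and the weights ("functional
# plasticity"). Kernel, loss, and update equations are assumptions made for
# clarity; they are not copied from the paper.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


class GClusteronSketch:
    def __init__(self, n_inputs, sigma=1.0, seed=None):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 0.1, n_inputs)    # synaptic weights
        self.l = rng.uniform(0.0, 10.0, n_inputs)  # synaptic locations on the dendrite
        self.b = 0.0                               # bias / threshold
        self.sigma = sigma                         # width of the proximity kernel

    def _kernel(self):
        d = self.l[:, None] - self.l[None, :]      # pairwise location differences
        return np.exp(-d ** 2 / (2.0 * self.sigma ** 2)), d

    def forward(self, x):
        K, _ = self._kernel()
        c = self.w * x                             # per-synapse drive
        return sigmoid(c @ K @ c + self.b)         # proximity-weighted pairwise interactions

    def update(self, x, y, eta_l=1e-3, eta_w=1e-3):
        K, d = self._kernel()
        c = self.w * x
        v = c @ K @ c                              # self-terms (i == j) included for simplicity
        p = sigmoid(v + self.b)
        err = p - y                                # dLoss/dv for sigmoid + cross-entropy
        # dv/dl_k = -(2 / sigma^2) * c_k * sum_j c_j * K_kj * (l_k - l_j)
        grad_l = -(2.0 / self.sigma ** 2) * c * np.sum(c[None, :] * K * d, axis=1)
        # dv/dw_k = 2 * x_k * sum_j K_kj * c_j
        grad_w = 2.0 * x * (K @ c)
        self.l -= eta_l * err * grad_l             # structural plasticity: move synapses
        self.w -= eta_w * err * grad_w             # functional plasticity: change weights
        self.b -= eta_w * err
        return p
```

Under these assumptions, the sign of err * c[k] * c[j] determines whether synapse k is pulled toward synapse j (attraction) or pushed away from it (repulsion), which mirrors the attract/repel wording in the abstract; a task such as XOR or a binarized MNIST digit pair would be trained by calling update repeatedly on labeled input vectors.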
format Online
Article
Text
id pubmed-8177649
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-8177649 2021-06-07 PLoS Comput Biol, Research Article. Public Library of Science 2021-05-24 /pmc/articles/PMC8177649/ /pubmed/34029309 http://dx.doi.org/10.1371/journal.pcbi.1009015 Text en © 2021 Moldwin et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
topic Research Article