Synaptic metaplasticity in binarized neural networks

Bibliographic Details
Main Authors: Laborieux, Axel, Ernoult, Maxence, Hirtzlin, Tifenn, Querlioz, Damien
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8100137/
https://www.ncbi.nlm.nih.gov/pubmed/33953183
http://dx.doi.org/10.1038/s41467-021-22768-y
author Laborieux, Axel
Ernoult, Maxence
Hirtzlin, Tifenn
Querlioz, Damien
collection PubMed
description While deep neural networks have surpassed human performance in multiple situations, they are prone to catastrophic forgetting: upon training a new task, they rapidly forget previously learned ones. Neuroscience studies, based on idealized tasks, suggest that in the brain, synapses overcome this issue by adjusting their plasticity depending on their past history. However, such “metaplastic” behaviors do not transfer directly to mitigate catastrophic forgetting in deep neural networks. In this work, we interpret the hidden weights used by binarized neural networks, a low-precision version of deep neural networks, as metaplastic variables, and modify their training technique to alleviate forgetting. Building on this idea, we propose and demonstrate experimentally, in situations of multitask and stream learning, a training technique that reduces catastrophic forgetting without needing previously presented data or formal boundaries between datasets, and with performance approaching more mainstream techniques that rely on task boundaries. We support our approach with a theoretical analysis on a tractable task. This work bridges computational neuroscience and deep learning, and presents significant assets for future embedded and neuromorphic systems, especially when using novel nanodevices featuring physics analogous to metaplasticity.
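The description above outlines the mechanism only at a high level. A minimal sketch of the idea it names is given below, reconstructed from the abstract alone rather than from the authors' reference code: the binary weights used by the network are the signs of real-valued hidden weights, and updates that would push a hidden weight back toward a sign flip are attenuated as the weight's magnitude (its consolidation) grows. The attenuation function 1 - tanh^2(m*|w|), the strength parameter m, and the function name metaplastic_update are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def metaplastic_update(hidden_w, grad, lr=0.01, m=1.0):
        """One metaplastic SGD step on the hidden weights of a binarized
        network. Updates that would shrink |hidden_w| (i.e. move the weight
        toward a sign flip) are damped by a factor that decreases with the
        weight's magnitude; updates that grow |hidden_w| pass unchanged."""
        step = -lr * grad
        # An update opposes consolidation when its sign differs from the weight's.
        opposing = np.sign(step) != np.sign(hidden_w)
        attenuation = np.where(opposing, 1.0 - np.tanh(m * np.abs(hidden_w)) ** 2, 1.0)
        return hidden_w + attenuation * step

    # Usage: the forward pass uses only the sign bits of the hidden weights.
    w_hidden = np.random.randn(4)
    g = np.random.randn(4)            # gradient w.r.t. the binary weights
    w_hidden = metaplastic_update(w_hidden, g)
    w_binary = np.sign(w_hidden)      # weights actually used by the network

Under these assumptions, a hidden weight that has grown large through consistent updates becomes hard to flip, which is how the scheme protects weights important to earlier tasks without storing past data or task boundaries.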
format Online
Article
Text
id pubmed-8100137
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling Synaptic metaplasticity in binarized neural networks. Nat Commun, Nature Publishing Group UK, published online 2021-05-05. © The Author(s) 2021. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as appropriate credit is given to the original author(s) and the source, a link to the license is provided, and any changes are indicated. License: https://creativecommons.org/licenses/by/4.0/
title Synaptic metaplasticity in binarized neural networks
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8100137/
https://www.ncbi.nlm.nih.gov/pubmed/33953183
http://dx.doi.org/10.1038/s41467-021-22768-y