Information-Theoretic Intrinsic Plasticity for Online Unsupervised Learning in Spiking Neural Networks
As a self-adaptive mechanism, intrinsic plasticity (IP) plays an essential role in maintaining homeostasis and shaping the dynamics of neural circuits. From a computational point of view, IP has the potential to enable promising non-Hebbian learning in artificial neural networks. While IP based learning has been attempted for spiking neuron models, the existing IP rules are ad hoc in nature, and the practical success of their application has not been demonstrated particularly toward enabling real-life learning tasks. This work aims to address the theoretical and practical limitations of the existing works by proposing a new IP rule named SpiKL-IP.
Main Authors: | Zhang, Wenrui, Li, Peng |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A. 2019 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6371195/ https://www.ncbi.nlm.nih.gov/pubmed/30804736 http://dx.doi.org/10.3389/fnins.2019.00031 |
_version_ | 1783394526669307904 |
---|---|
author | Zhang, Wenrui Li, Peng |
author_facet | Zhang, Wenrui Li, Peng |
author_sort | Zhang, Wenrui |
collection | PubMed |
description | As a self-adaptive mechanism, intrinsic plasticity (IP) plays an essential role in maintaining homeostasis and shaping the dynamics of neural circuits. From a computational point of view, IP has the potential to enable promising non-Hebbian learning in artificial neural networks. While IP based learning has been attempted for spiking neuron models, the existing IP rules are ad hoc in nature, and the practical success of their application has not been demonstrated particularly toward enabling real-life learning tasks. This work aims to address the theoretical and practical limitations of the existing works by proposing a new IP rule named SpiKL-IP. SpiKL-IP is developed based on a rigorous information-theoretic approach where the target of IP tuning is to maximize the entropy of the output firing rate distribution of each spiking neuron. This goal is achieved by tuning the output firing rate distribution toward a targeted optimal exponential distribution. Operating on a proposed firing-rate transfer function, SpiKL-IP adapts the intrinsic parameters of a spiking neuron while minimizing the KL-divergence from the targeted exponential distribution to the actual output firing rate distribution. SpiKL-IP can robustly operate in an online manner under complex inputs and network settings. Simulation studies demonstrate that the application of SpiKL-IP to individual neurons in isolation or as part of a larger spiking neural network robustly produces the desired exponential distribution. The evaluation of SpiKL-IP under real-world speech and image classification tasks shows that SpiKL-IP noticeably outperforms two existing IP rules and can significantly boost recognition accuracy by up to more than 16%. |
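The abstract describes KL-minimizing intrinsic plasticity: adjusting a neuron's intrinsic parameters so that its output firing-rate distribution approaches an exponential distribution (the maximum-entropy distribution for a fixed mean rate). The sketch below illustrates that idea with a classic Triesch-style rate-based IP rule for a sigmoid transfer function, not the paper's actual SpiKL-IP rule, which operates on the intrinsic parameters of a spiking neuron through a firing-rate transfer function. The gain `a`, bias `b`, target mean rate `mu`, and learning rate `eta` are illustrative names chosen here.

```python
import numpy as np

def ip_step(x, a, b, mu=0.1, eta=0.005):
    """One online intrinsic-plasticity update (Triesch-style sigmoid rule).

    Nudges the parameters of y = sigmoid(a*x + b) so that the distribution
    of y drifts toward an exponential distribution with mean `mu`, via
    stochastic gradient descent on the KL divergence to that target.
    """
    y = 1.0 / (1.0 + np.exp(-(a * x + b)))
    # Shared gradient factor from the KL objective with exponential target.
    common = 1.0 - (2.0 + 1.0 / mu) * y + (y * y) / mu
    b_new = b + eta * common
    a_new = a + eta * (1.0 / a + x * common)
    return y, a_new, b_new

rng = np.random.default_rng(0)
a, b = 1.0, 0.0
rates = []
for _ in range(30000):
    x = rng.normal()              # synthetic input drive
    y, a, b = ip_step(x, a, b, mu=0.1)
    rates.append(y)

mean_rate = float(np.mean(rates[-5000:]))  # average over the settled tail
```

Under Gaussian input drive, the empirical mean output rate settles in the neighborhood of the target `mu`, which is the homeostatic behavior the abstract attributes to IP; SpiKL-IP additionally shapes the whole distribution toward the exponential form rather than matching the mean alone.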
format | Online Article Text |
id | pubmed-6371195 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-63711952019-02-25 Information-Theoretic Intrinsic Plasticity for Online Unsupervised Learning in Spiking Neural Networks Zhang, Wenrui Li, Peng Front Neurosci Neuroscience As a self-adaptive mechanism, intrinsic plasticity (IP) plays an essential role in maintaining homeostasis and shaping the dynamics of neural circuits. From a computational point of view, IP has the potential to enable promising non-Hebbian learning in artificial neural networks. While IP based learning has been attempted for spiking neuron models, the existing IP rules are ad hoc in nature, and the practical success of their application has not been demonstrated particularly toward enabling real-life learning tasks. This work aims to address the theoretical and practical limitations of the existing works by proposing a new IP rule named SpiKL-IP. SpiKL-IP is developed based on a rigorous information-theoretic approach where the target of IP tuning is to maximize the entropy of the output firing rate distribution of each spiking neuron. This goal is achieved by tuning the output firing rate distribution toward a targeted optimal exponential distribution. Operating on a proposed firing-rate transfer function, SpiKL-IP adapts the intrinsic parameters of a spiking neuron while minimizing the KL-divergence from the targeted exponential distribution to the actual output firing rate distribution. SpiKL-IP can robustly operate in an online manner under complex inputs and network settings. Simulation studies demonstrate that the application of SpiKL-IP to individual neurons in isolation or as part of a larger spiking neural network robustly produces the desired exponential distribution. The evaluation of SpiKL-IP under real-world speech and image classification tasks shows that SpiKL-IP noticeably outperforms two existing IP rules and can significantly boost recognition accuracy by up to more than 16%. Frontiers Media S.A. 
2019-02-05 /pmc/articles/PMC6371195/ /pubmed/30804736 http://dx.doi.org/10.3389/fnins.2019.00031 Text en Copyright © 2019 Zhang and Li. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Zhang, Wenrui Li, Peng Information-Theoretic Intrinsic Plasticity for Online Unsupervised Learning in Spiking Neural Networks |
title | Information-Theoretic Intrinsic Plasticity for Online Unsupervised Learning in Spiking Neural Networks |
title_full | Information-Theoretic Intrinsic Plasticity for Online Unsupervised Learning in Spiking Neural Networks |
title_fullStr | Information-Theoretic Intrinsic Plasticity for Online Unsupervised Learning in Spiking Neural Networks |
title_full_unstemmed | Information-Theoretic Intrinsic Plasticity for Online Unsupervised Learning in Spiking Neural Networks |
title_short | Information-Theoretic Intrinsic Plasticity for Online Unsupervised Learning in Spiking Neural Networks |
title_sort | information-theoretic intrinsic plasticity for online unsupervised learning in spiking neural networks |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6371195/ https://www.ncbi.nlm.nih.gov/pubmed/30804736 http://dx.doi.org/10.3389/fnins.2019.00031 |
work_keys_str_mv | AT zhangwenrui informationtheoreticintrinsicplasticityforonlineunsupervisedlearninginspikingneuralnetworks AT lipeng informationtheoreticintrinsicplasticityforonlineunsupervisedlearninginspikingneuralnetworks |