Error-Gated Hebbian Rule: A Local Learning Rule for Principal and Independent Component Analysis

We developed a biologically plausible unsupervised learning algorithm, the error-gated Hebbian rule (EGHR)-β, that performs principal component analysis (PCA) and independent component analysis (ICA) in a single-layer feedforward neural network. If the parameter β = 1, it can extract the subspace spanned by the major principal components, similarly to Oja’s subspace rule for PCA. If β = 0, it can separate independent sources, similarly to the Bell-Sejnowski ICA rule, but without requiring the same number of input and output neurons. Unlike these engineering rules, the EGHR-β can be easily implemented in a biological or neuromorphic circuit because it uses only local information available at each synapse. We analytically and numerically demonstrate the reliability of the EGHR-β in extracting and separating major sources from high-dimensional input. By adjusting β, the EGHR-β can extract sources that are missed by the conventional engineering approach of first applying PCA and then ICA. In particular, the proposed rule can successfully extract hidden natural images even in the presence of dominant or non-Gaussian noise components. The results highlight the reliability and utility of the EGHR-β for large-scale parallel computation of PCA and ICA and its future implementation in neuromorphic hardware.
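
As a point of reference for the comparisons in the abstract, here is a minimal NumPy sketch of the two classical rules that the EGHR-β is said to behave like in its limiting cases: Oja's subspace rule (the β = 1 case) and a natural-gradient form of the Bell-Sejnowski infomax ICA rule (the β = 0 case). This is not the authors' EGHR-β update, which is defined in the full text; the network u = W x, the learning rate eta, and the tanh nonlinearity are illustrative assumptions.

    # Minimal sketch of the classical baselines named in the abstract (not the EGHR-beta).
    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_out, eta = 10, 3, 1e-3

    def oja_subspace_step(W, x):
        """Oja's subspace rule: dW = eta * (u x^T - u u^T W), with u = W x."""
        u = W @ x
        return W + eta * (np.outer(u, x) - np.outer(u, u) @ W)

    def ica_natgrad_step(W, x):
        """Natural-gradient (Amari) form of the Bell-Sejnowski infomax rule,
        assuming a square W: dW = eta * (I - g(u) u^T) W, here with g = tanh."""
        u = W @ x
        g = np.tanh(u)
        return W + eta * (np.eye(W.shape[0]) - np.outer(g, u)) @ W

    # Toy usage: inputs are linear mixtures of 3 latent sources embedded in 10 dimensions,
    # so Oja's subspace rule drives the rows of W toward the 3-dimensional principal subspace.
    A = rng.standard_normal((n_in, n_out))
    W = rng.standard_normal((n_out, n_in)) * 0.1
    for _ in range(5000):
        x = A @ rng.standard_normal(n_out)
        W = oja_subspace_step(W, x)

Note that, unlike the EGHR-β described above, the ICA update in this sketch needs a square unmixing matrix; the abstract's point is precisely that the EGHR-β avoids this constraint while using only synapse-local quantities.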


Bibliographic Details
Main Authors: Isomura, Takuya; Toyoizumi, Taro
Format: Online Article (Text)
Language: English
Published: Nature Publishing Group UK, 2018
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5789861/
https://www.ncbi.nlm.nih.gov/pubmed/29382868
http://dx.doi.org/10.1038/s41598-018-20082-0
Collection: PubMed (record ID: pubmed-5789861)
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Sci Rep
Published Online: 2018-01-30
License: © The Author(s) 2018. Open Access under a Creative Commons Attribution 4.0 International License; see http://creativecommons.org/licenses/by/4.0/.