Learning in Convolutional Neural Networks Accelerated by Transfer Entropy

Recently, there has been growing interest in applying Transfer Entropy (TE) to quantify the effective connectivity between artificial neurons. In a feedforward network, the TE can be used to quantify the relationships between neuron output pairs located in different layers. Our focus is on how to include the TE in the learning mechanisms of a Convolutional Neural Network (CNN) architecture. We introduce a novel training mechanism for CNN architectures which integrates TE feedback connections. Adding the TE feedback parameter accelerates the training process, as fewer epochs are needed; on the other hand, it adds computational overhead to each epoch. According to our experiments on CNN classifiers, a reasonable computational overhead–accuracy trade-off is achieved by considering only the inter-neural information transfer of neuron pairs between the last two fully connected layers. The TE acts as a smoothing factor, generating stability and becoming active only periodically, not after processing each input sample. Therefore, we can consider the TE in our model a slowly changing meta-parameter.


Bibliographic Details
Main Authors: Moldovan, Adrian, Caţaron, Angel, Andonie, Răzvan
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8471588/
https://www.ncbi.nlm.nih.gov/pubmed/34573843
http://dx.doi.org/10.3390/e23091218
author Moldovan, Adrian
Caţaron, Angel
Andonie, Răzvan
collection PubMed
description Recently, there has been growing interest in applying Transfer Entropy (TE) to quantify the effective connectivity between artificial neurons. In a feedforward network, the TE can be used to quantify the relationships between neuron output pairs located in different layers. Our focus is on how to include the TE in the learning mechanisms of a Convolutional Neural Network (CNN) architecture. We introduce a novel training mechanism for CNN architectures which integrates TE feedback connections. Adding the TE feedback parameter accelerates the training process, as fewer epochs are needed; on the other hand, it adds computational overhead to each epoch. According to our experiments on CNN classifiers, a reasonable computational overhead–accuracy trade-off is achieved by considering only the inter-neural information transfer of neuron pairs between the last two fully connected layers. The TE acts as a smoothing factor, generating stability and becoming active only periodically, not after processing each input sample. Therefore, we can consider the TE in our model a slowly changing meta-parameter.
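The core quantity the abstract relies on, the transfer entropy from one neuron's output series to another's, can be sketched with a generic plug-in estimator. This is an illustrative sketch, not the authors' implementation: it assumes the activations have been discretized (e.g. binarized) and uses a history length of one.

```python
from collections import Counter
from math import log2

def transfer_entropy(src, dst):
    """Plug-in estimate (in bits) of transfer entropy from `src` to `dst`
    for discretized activation sequences, with history length 1:
    TE = sum over (d', d, s) of p(d', d, s) * log2[ p(d'|d, s) / p(d'|d) ],
    where d' = dst[t+1], d = dst[t], s = src[t]."""
    n = len(dst) - 1  # number of observed transitions
    triples = Counter(zip(dst[1:], dst[:-1], src[:-1]))  # counts of (d', d, s)
    pairs_ds = Counter(zip(dst[:-1], src[:-1]))          # counts of (d, s)
    pairs_dd = Counter(zip(dst[1:], dst[:-1]))           # counts of (d', d)
    singles = Counter(dst[:-1])                          # counts of (d,)
    te = 0.0
    for (d_next, d, s), count in triples.items():
        p_joint = count / n                              # p(d', d, s)
        p_full = count / pairs_ds[(d, s)]                # p(d' | d, s)
        p_self = pairs_dd[(d_next, d)] / singles[d]      # p(d' | d)
        te += p_joint * log2(p_full / p_self)
    return te
```

When `dst` merely echoes `dst`'s own past (e.g. `src` identical to `dst`), the estimate is zero; when `src` carries information about `dst`'s next state beyond what `dst`'s own past provides, it is positive. In the paper's setting such pairwise estimates feed back into training as a periodically updated, slowly changing meta-parameter.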
format Online
Article
Text
id pubmed-8471588
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8471588 2021-09-28 Entropy (Basel) Article MDPI 2021-09-16 /pmc/articles/PMC8471588/ /pubmed/34573843 http://dx.doi.org/10.3390/e23091218 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
topic Article
work_keys_str_mv AT moldovanadrian learninginconvolutionalneuralnetworksacceleratedbytransferentropy
AT cataronangel learninginconvolutionalneuralnetworksacceleratedbytransferentropy
AT andonierazvan learninginconvolutionalneuralnetworksacceleratedbytransferentropy