Downward-Growing Neural Networks
A major issue in the application of deep learning is the definition of a proper architecture for the learning machine at hand, in such a way that the model is neither excessively large (which results in overfitting the training data) nor too small (which limits the learning and modeling capabilities of the automatic learner).
Main Authors: | Laveglia, Vincenzo; Trentin, Edmondo |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10217234/ https://www.ncbi.nlm.nih.gov/pubmed/37238488 http://dx.doi.org/10.3390/e25050733 |
_version_ | 1785048487904149504 |
---|---|
author | Laveglia, Vincenzo; Trentin, Edmondo |
author_facet | Laveglia, Vincenzo; Trentin, Edmondo |
author_sort | Laveglia, Vincenzo |
collection | PubMed |
description | A major issue in the application of deep learning is the definition of a proper architecture for the learning machine at hand, in such a way that the model is neither excessively large (which results in overfitting the training data) nor too small (which limits the learning and modeling capabilities of the automatic learner). Facing this issue boosted the development of algorithms for automatically growing and pruning the architectures as part of the learning process. The paper introduces a novel approach to growing the architecture of deep neural networks, called downward-growing neural network (DGNN). The approach can be applied to arbitrary feed-forward deep neural networks. Groups of neurons that negatively affect the performance of the network are selected and grown with the aim of improving the learning and generalization capabilities of the resulting machine. The growing process is realized via replacement of these groups of neurons with sub-networks that are trained relying on ad hoc target propagation techniques. In so doing, the growth process takes place simultaneously in both the depth and width of the DGNN architecture. We assess empirically the effectiveness of the DGNN on several UCI datasets, where the DGNN significantly improves the average accuracy over a range of established deep neural network approaches and over two popular growing algorithms, namely, the AdaNet and the cascade correlation neural network. |
format | Online Article Text |
id | pubmed-10217234 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-10217234 2023-05-27 Downward-Growing Neural Networks Laveglia, Vincenzo; Trentin, Edmondo. Entropy (Basel), Article. (Abstract identical to the description field above.) MDPI 2023-04-28 /pmc/articles/PMC10217234/ /pubmed/37238488 http://dx.doi.org/10.3390/e25050733 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Laveglia, Vincenzo Trentin, Edmondo Downward-Growing Neural Networks |
title | Downward-Growing Neural Networks |
title_full | Downward-Growing Neural Networks |
title_fullStr | Downward-Growing Neural Networks |
title_full_unstemmed | Downward-Growing Neural Networks |
title_short | Downward-Growing Neural Networks |
title_sort | downward-growing neural networks |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10217234/ https://www.ncbi.nlm.nih.gov/pubmed/37238488 http://dx.doi.org/10.3390/e25050733 |
work_keys_str_mv | AT lavegliavincenzo downwardgrowingneuralnetworks AT trentinedmondo downwardgrowingneuralnetworks |
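The abstract above describes the DGNN's core growth step: a group of neurons that hurts performance is replaced by a sub-network, so the architecture grows in depth and width at once. The following is a purely illustrative sketch of that structural operation on a plain feed-forward net represented as a list of weight matrices; it is not the authors' algorithm, and the two steps that define the DGNN proper (performance-based selection of the group to grow, and training the sub-network via target propagation) are deliberately omitted.

```python
import numpy as np

def grow_layer(layers, idx, hidden, rng):
    """Replace layer `idx` with a deeper/wider two-layer sub-network.

    `layers` is a list of weight matrices for a plain feed-forward net.
    The replacement preserves the input/output dimensions of the grown
    layer while adding one level of depth and `hidden` units of width.
    Illustrative only: in the DGNN the grown group is chosen by its
    (negative) impact on performance and the sub-network is trained
    with target propagation; neither step is modeled here.
    """
    n_in, n_out = layers[idx].shape
    w1 = rng.standard_normal((n_in, hidden)) * 0.1   # new hidden layer (widens)
    w2 = rng.standard_normal((hidden, n_out)) * 0.1  # restores output size (deepens)
    return layers[:idx] + [w1, w2] + layers[idx + 1:]

rng = np.random.default_rng(0)
net = [rng.standard_normal((4, 8)), rng.standard_normal((8, 2))]
net = grow_layer(net, 1, hidden=16, rng=rng)  # grow the output layer
print([w.shape for w in net])  # [(4, 8), (8, 16), (16, 2)]
```

After the call, the two-layer net has become a three-layer net whose external interface (4 inputs, 2 outputs) is unchanged, which is what lets growth happen "downward" inside an already-trained architecture.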