Modern synergetic neural network for imbalanced small data classification
Main authors: | Wang, Zihao; Li, Haifeng; Ma, Lin |
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2023 |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10514188/ https://www.ncbi.nlm.nih.gov/pubmed/37735230 http://dx.doi.org/10.1038/s41598-023-42689-8 |
_version_ | 1785108675123216384 |
author | Wang, Zihao; Li, Haifeng; Ma, Lin |
author_facet | Wang, Zihao; Li, Haifeng; Ma, Lin |
author_sort | Wang, Zihao |
collection | PubMed |
description | Deep learning’s performance on imbalanced small data is substantially degraded by overfitting. Recurrent neural networks retain better performance on such tasks by constructing dynamical systems for robustness. The synergetic neural network (SNN), a synergetics-based recurrent neural network, has the advantage of eliminating recall errors and pseudo-memories, but suffers from frequent association errors. Since the cause remains unclear, most subsequent studies use genetic algorithms to tune parameters for better accuracy, which occupies the parameter-optimization space and hinders task-oriented tuning. To solve this problem and extend SNN’s applicability, we propose the modern synergetic neural network (MSNN) model. MSNN eliminates the association error by correcting the state-initialization method in the working process, freeing the parameter-optimization space. In addition, MSNN optimizes the network’s attention parameter with the error-backpropagation algorithm and a gradient-bypass technique, allowing the network to be trained jointly with other network layers. Self-learning of the attention parameter enables adaptation to imbalanced sample sizes, further improving classification performance. Across 75 classification tasks on small UC Irvine Machine Learning Repository datasets, MSNN achieves the best average rank among 187 neural and non-neural machine learning methods. |
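The abstract mentions training the attention parameter by backpropagation with a gradient-bypass technique. The paper's exact formulation is not given here; the following is only a minimal, hypothetical sketch of the general gradient-bypass (straight-through) idea: the forward pass applies a hard, non-differentiable winner-take-all over attention-weighted scores, while the backward pass treats that step as the identity so gradients still reach the attention weights. All names and values below are illustrative, not from the paper.

```python
import numpy as np

def forward(scores, attention):
    """Attention-weighted scores, then a hard winner-take-all (non-differentiable)."""
    weighted = scores * attention
    hard = np.zeros_like(weighted)
    hard[np.argmax(weighted)] = 1.0           # one-hot winner
    return weighted, hard

def backward_bypass(grad_hard):
    """Gradient bypass: pretend the hard argmax was the identity map,
    so the upstream gradient flows straight through to the weighted scores."""
    return grad_hard

# Toy training step: the target class is index 1, but index 0 currently wins.
scores = np.array([0.9, 0.8, 0.1])
attention = np.ones(3)                        # learnable attention parameter
weighted, hard = forward(scores, attention)
target = np.array([0.0, 1.0, 0.0])

grad_hard = hard - target                     # gradient of a squared-error-style loss
grad_weighted = backward_bypass(grad_hard)    # bypass the non-differentiable argmax
grad_attention = grad_weighted * scores       # chain rule through the weighting
attention -= 1.0 * grad_attention             # one gradient step
```

After this single step the attention weight of the under-attended target class grows while the winner's shrinks, so the target class wins the next winner-take-all pass; this is the mechanism by which a learned attention parameter can compensate for imbalanced sample sizes.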
format | Online Article Text |
id | pubmed-10514188 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-105141882023-09-23 Modern synergetic neural network for imbalanced small data classification Wang, Zihao; Li, Haifeng; Ma, Lin Sci Rep Article |
Nature Publishing Group UK 2023-09-21 /pmc/articles/PMC10514188/ /pubmed/37735230 http://dx.doi.org/10.1038/s41598-023-42689-8 Text en © The Author(s) 2023. Open Access under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Wang, Zihao Li, Haifeng Ma, Lin Modern synergetic neural network for imbalanced small data classification |
title | Modern synergetic neural network for imbalanced small data classification |
title_full | Modern synergetic neural network for imbalanced small data classification |
title_fullStr | Modern synergetic neural network for imbalanced small data classification |
title_full_unstemmed | Modern synergetic neural network for imbalanced small data classification |
title_short | Modern synergetic neural network for imbalanced small data classification |
title_sort | modern synergetic neural network for imbalanced small data classification |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10514188/ https://www.ncbi.nlm.nih.gov/pubmed/37735230 http://dx.doi.org/10.1038/s41598-023-42689-8 |
work_keys_str_mv | AT wangzihao modernsynergeticneuralnetworkforimbalancedsmalldataclassification AT lihaifeng modernsynergeticneuralnetworkforimbalancedsmalldataclassification AT malin modernsynergeticneuralnetworkforimbalancedsmalldataclassification |