
Improved GWO and its application in parameter optimization of Elman neural network

Traditional neural networks are trained with gradient descent methods, which cannot handle complex optimization problems. We propose an improved grey wolf optimizer (SGWO) to explore a better network structure. GWO is improved with circle population initialization, an information interaction mechanism, and adaptive position updating to enhance its search performance. SGWO is applied to optimize the Elman network structure, yielding a new prediction method (SGWO-Elman). The convergence of SGWO is analyzed mathematically, and the optimization ability of SGWO and the prediction performance of SGWO-Elman are examined in comparative experiments. The results show: (1) the global convergence probability of SGWO is 1, and its search process is a finite homogeneous Markov chain with an absorbing state; (2) SGWO not only achieves better optimization performance on complex functions of different dimensions, but also, when applied to parameter optimization of Elman, significantly improves the network structure, and SGWO-Elman delivers accurate predictions.

Bibliographic Details
Main Authors: Liu, Wei, Sun, Jiayang, Liu, Guangwei, Fu, Saiou, Liu, Mengyuan, Zhu, Yixin, Gao, Qi
Format: Online Article Text
Language: English
Published: Public Library of Science 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10328355/
https://www.ncbi.nlm.nih.gov/pubmed/37418374
http://dx.doi.org/10.1371/journal.pone.0288071
author Liu, Wei
Sun, Jiayang
Liu, Guangwei
Fu, Saiou
Liu, Mengyuan
Zhu, Yixin
Gao, Qi
collection PubMed
description Traditional neural networks are trained with gradient descent methods, which cannot handle complex optimization problems. We propose an improved grey wolf optimizer (SGWO) to explore a better network structure. GWO is improved with circle population initialization, an information interaction mechanism, and adaptive position updating to enhance its search performance. SGWO is applied to optimize the Elman network structure, yielding a new prediction method (SGWO-Elman). The convergence of SGWO is analyzed mathematically, and the optimization ability of SGWO and the prediction performance of SGWO-Elman are examined in comparative experiments. The results show: (1) the global convergence probability of SGWO is 1, and its search process is a finite homogeneous Markov chain with an absorbing state; (2) SGWO not only achieves better optimization performance on complex functions of different dimensions, but also, when applied to parameter optimization of Elman, significantly improves the network structure, and SGWO-Elman delivers accurate predictions.
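The abstract describes SGWO as a variant of the grey wolf optimizer (GWO). The SGWO-specific modifications (circle population initialization, the information interaction mechanism, adaptive position updating) are not detailed in this record, so the following is only a minimal sketch of the baseline GWO position-update loop that such variants build on; all names, parameters, and defaults are illustrative, not taken from the paper.

```python
import numpy as np

def gwo(fitness, dim, bounds, n_wolves=20, n_iter=200, seed=0):
    """Minimal baseline grey wolf optimizer (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))
    scores = np.array([fitness(w) for w in wolves])

    for t in range(n_iter):
        # Alpha, beta, delta: the three best wolves guide the pack.
        order = np.argsort(scores)
        alpha, beta, delta = (wolves[j].copy() for j in order[:3])
        a = 2.0 * (1.0 - t / n_iter)      # control parameter decays 2 -> 0

        for i in range(n_wolves):
            x_new = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2.0 * a * r1 - a      # |A| > 1 explores, |A| < 1 exploits
                C = 2.0 * r2
                D = np.abs(C * leader - wolves[i])
                x_new += leader - A * D   # step toward (or past) this leader
            wolves[i] = np.clip(x_new / 3.0, lo, hi)  # average of the 3 pulls
            scores[i] = fitness(wolves[i])

    best = int(np.argmin(scores))
    return wolves[best], float(scores[best])
```

For example, minimizing the sphere function f(x) = Σx_i² in five dimensions drives the best wolf close to the origin. In a setup like SGWO-Elman, the fitness function would instead decode each wolf's position into Elman network weights and thresholds and return the network's training error.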
id pubmed-10328355
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-10328355 2023-07-08. PLoS One, Research Article. Public Library of Science, published 2023-07-07. Text en © 2023 Liu et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
topic Research Article