A hybridizing-enhanced differential evolution for optimization
Differential evolution (DE) is among the most widely used optimization algorithms and has appeared in many improved and modern variants in recent years. Its main drawback is generally a low convergence rate. In this article, the gray wolf optimizer (GWO) is used to accelerate the convergence rate and improve the final optimal results of the DE algorithm.
Main Authors: | Ghasemi, Mojtaba; Zare, Mohsen; Trojovský, Pavel; Zahedibialvaei, Amir; Trojovská, Eva |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | PeerJ Inc. 2023 |
Subjects: | Algorithms and Analysis of Algorithms |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10280462/ https://www.ncbi.nlm.nih.gov/pubmed/37346618 http://dx.doi.org/10.7717/peerj-cs.1420 |
_version_ | 1785060799736184832 |
---|---|
author | Ghasemi, Mojtaba Zare, Mohsen Trojovský, Pavel Zahedibialvaei, Amir Trojovská, Eva |
author_facet | Ghasemi, Mojtaba Zare, Mohsen Trojovský, Pavel Zahedibialvaei, Amir Trojovská, Eva |
author_sort | Ghasemi, Mojtaba |
collection | PubMed |
description | Differential evolution (DE) is among the most widely used optimization algorithms and has appeared in many improved and modern variants in recent years. Its main drawback is generally a low convergence rate. In this article, the gray wolf optimizer (GWO) is used to accelerate the convergence rate and improve the final optimal results of the DE algorithm. The resulting algorithm is called Hunting Differential Evolution (HDE). The proposed HDE algorithm combines the convergence speed of GWO with the strong search capability of DE. Furthermore, by adjusting the crossover-rate and mutation-probability parameters, the algorithm can be tuned to emphasize the strengths of either of the two components. HDE/current-to-rand/1 performed best on the CEC-2019 functions compared with the other eight HDE variants. HDE/current-to-best/1, selected as the best-performing proposed variant, was compared with seven improved algorithms on the CEC-2014 functions and outperformed them on 15 test functions. Furthermore, jHDE performs well, improving on 17 of these functions compared with jDE. The simulations indicate that, compared with the original DE algorithm, the proposed HDE can reliably find optimal solutions with a rapid convergence rate while avoiding local minima. |
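The record does not include the paper's actual update rules, but the abstract names the ingredients: DE mutation and crossover (e.g. the current-to-best/1 strategy) and GWO's hunting step toward the best solutions found so far. A minimal sketch of that kind of hybridization, assuming a generic sphere test objective and illustrative parameter names (`F`, `CR`, `p_gwo` are assumptions, not the paper's notation), might look like:

```python
import numpy as np

def sphere(x):
    # Simple test objective (an assumption for illustration, not from the paper).
    return float(np.sum(x ** 2))

def hybrid_de_gwo(f, dim=10, pop_size=30, iters=200, F=0.5, CR=0.9, p_gwo=0.5, seed=0):
    """Illustrative DE/GWO hybrid: each individual is updated either by a
    classic DE/current-to-best/1 step with binomial crossover, or by a
    standard GWO-style move toward the three best solutions (alpha, beta, delta)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, (pop_size, dim))
    fit = np.array([f(x) for x in X])
    for t in range(iters):
        order = np.argsort(fit)
        alpha, beta, delta = X[order[:3]]     # fancy indexing copies the leaders
        a = 2.0 * (1.0 - t / iters)           # GWO coefficient shrinks over time
        for i in range(pop_size):
            if rng.random() < p_gwo:
                # GWO hunting step: average of moves toward the three leaders.
                cand = np.zeros(dim)
                for leader in (alpha, beta, delta):
                    A = a * (2.0 * rng.random(dim) - 1.0)
                    C = 2.0 * rng.random(dim)
                    cand += leader - A * np.abs(C * leader - X[i])
                cand /= 3.0
            else:
                # DE/current-to-best/1 mutation followed by binomial crossover.
                r1, r2 = rng.choice([j for j in range(pop_size) if j != i], 2, replace=False)
                v = X[i] + F * (alpha - X[i]) + F * (X[r1] - X[r2])
                mask = rng.random(dim) < CR
                mask[rng.integers(dim)] = True  # ensure at least one gene crosses over
                cand = np.where(mask, v, X[i])
            fc = f(cand)
            if fc <= fit[i]:                  # greedy selection keeps the better vector
                X[i], fit[i] = cand, fc
    best = int(np.argmin(fit))
    return X[best], float(fit[best])
```

Here `p_gwo` plays the role the abstract describes qualitatively: shifting it toward 0 or 1 makes the hybrid pay closer attention to the DE search behavior or to the GWO convergence behavior, respectively.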
format | Online Article Text |
id | pubmed-10280462 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | PeerJ Inc. |
record_format | MEDLINE/PubMed |
spelling | pubmed-10280462 2023-06-21 A hybridizing-enhanced differential evolution for optimization Ghasemi, Mojtaba; Zare, Mohsen; Trojovský, Pavel; Zahedibialvaei, Amir; Trojovská, Eva PeerJ Comput Sci Algorithms and Analysis of Algorithms PeerJ Inc. 2023-06-01 /pmc/articles/PMC10280462/ /pubmed/37346618 http://dx.doi.org/10.7717/peerj-cs.1420 Text en © 2023 Ghasemi et al.
https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose, provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ Computer Science) and either DOI or URL of the article must be cited. |
spellingShingle | Algorithms and Analysis of Algorithms Ghasemi, Mojtaba Zare, Mohsen Trojovský, Pavel Zahedibialvaei, Amir Trojovská, Eva A hybridizing-enhanced differential evolution for optimization |
title | A hybridizing-enhanced differential evolution for optimization |
title_full | A hybridizing-enhanced differential evolution for optimization |
title_fullStr | A hybridizing-enhanced differential evolution for optimization |
title_full_unstemmed | A hybridizing-enhanced differential evolution for optimization |
title_short | A hybridizing-enhanced differential evolution for optimization |
title_sort | hybridizing-enhanced differential evolution for optimization |
topic | Algorithms and Analysis of Algorithms |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10280462/ https://www.ncbi.nlm.nih.gov/pubmed/37346618 http://dx.doi.org/10.7717/peerj-cs.1420 |
work_keys_str_mv | AT ghasemimojtaba ahybridizingenhanceddifferentialevolutionforoptimization AT zaremohsen ahybridizingenhanceddifferentialevolutionforoptimization AT trojovskypavel ahybridizingenhanceddifferentialevolutionforoptimization AT zahedibialvaeiamir ahybridizingenhanceddifferentialevolutionforoptimization AT trojovskaeva ahybridizingenhanceddifferentialevolutionforoptimization AT ghasemimojtaba hybridizingenhanceddifferentialevolutionforoptimization AT zaremohsen hybridizingenhanceddifferentialevolutionforoptimization AT trojovskypavel hybridizingenhanceddifferentialevolutionforoptimization AT zahedibialvaeiamir hybridizingenhanceddifferentialevolutionforoptimization AT trojovskaeva hybridizingenhanceddifferentialevolutionforoptimization |