Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection
Feature selection is the process of decreasing the number of features in a dataset by removing redundant, irrelevant, and randomly class-correlated data features. By applying feature selection on large and highly dimensional datasets, the redundant features are removed, reducing the complexity of the data and reducing training time.
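As context for the hybrid approach named in the title, the following is a minimal, hypothetical sketch of how a grey wolf position update can be combined with a single gradient descent step on a differentiable test function. It is not the authors' implementation; the step size `eta`, the sphere test function, and all function names are assumptions made for illustration.

```python
# Hedged sketch: grey wolf optimizer (GWO) update followed by one gradient
# descent step. Not the paper's code; names and parameters are illustrative.
import numpy as np

def sphere(x):
    """Continuous test function f(x) = sum(x_i^2); global minimum at the origin."""
    return float(np.sum(x ** 2))

def sphere_grad(x):
    """Analytic gradient of the sphere function."""
    return 2.0 * x

def hybrid_gwo_gd(f, grad, dim=10, n_wolves=20, iters=200, eta=0.01, seed=0):
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(-5.0, 5.0, size=(n_wolves, dim))   # initial pack positions
    for t in range(iters):
        fitness = np.array([f(w) for w in wolves])
        order = np.argsort(fitness)
        alpha, beta, delta = wolves[order[:3]]               # three best wolves lead the pack
        a = 2.0 - 2.0 * t / iters                            # exploration factor decays linearly
        for i in range(n_wolves):
            candidate = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                D = np.abs(C * leader - wolves[i])
                candidate += leader - A * D
            candidate /= 3.0                                 # standard GWO averaging of the three pulls
            # Hybrid step: nudge the GWO candidate downhill with one gradient step.
            wolves[i] = candidate - eta * grad(candidate)
    best = wolves[np.argmin([f(w) for w in wolves])]
    return best, f(best)

if __name__ == "__main__":
    x_best, f_best = hybrid_gwo_gd(sphere, sphere_grad)
    print("best fitness:", f_best)
```

Here the gradient step acts as a local refinement of each wolf's GWO-proposed position, which is the general idea behind coupling a population-based search with gradient information.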
| Main Authors: | Kitonyi, Peter Mule; Segera, Davies Rene |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Hindawi, 2021 |
| Subjects: | Research Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8421169/ https://www.ncbi.nlm.nih.gov/pubmed/34497846 http://dx.doi.org/10.1155/2021/2555622 |
_version_ | 1783749021410525184 |
author | Kitonyi, Peter Mule Segera, Davies Rene |
author_facet | Kitonyi, Peter Mule Segera, Davies Rene |
author_sort | Kitonyi, Peter Mule |
collection | PubMed |
description | Feature selection is the process of decreasing the number of features in a dataset by removing redundant, irrelevant, and randomly class-correlated data features. By applying feature selection on large and highly dimensional datasets, the redundant features are removed, reducing the complexity of the data and reducing training time. The objective of this paper was to design an optimizer that combines the well-known metaheuristic population-based optimizer, the grey wolf algorithm, with the gradient descent algorithm, and to test it for applications in feature selection problems. The proposed algorithm was first compared against the original grey wolf algorithm on 23 continuous test functions. The proposed optimizer was then adapted for feature selection, and 3 binary implementations were developed, with the final implementation compared against two implementations of the binary grey wolf optimizer and the binary grey wolf particle swarm optimizer on 6 medical datasets from the UCI machine learning repository, on metrics such as accuracy, size of feature subsets, F-measure, precision, and sensitivity. The proposed optimizer outperformed the three other optimizers on 3 of the 6 datasets in average metrics. The proposed optimizer showed promise in its capability to balance the two objectives in feature selection and could be further enhanced. |
format | Online Article Text |
id | pubmed-8421169 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Hindawi |
record_format | MEDLINE/PubMed |
spelling | pubmed-8421169 2021-09-07 Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection Kitonyi, Peter Mule Segera, Davies Rene Biomed Res Int Research Article Feature selection is the process of decreasing the number of features in a dataset by removing redundant, irrelevant, and randomly class-correlated data features. By applying feature selection on large and highly dimensional datasets, the redundant features are removed, reducing the complexity of the data and reducing training time. The objective of this paper was to design an optimizer that combines the well-known metaheuristic population-based optimizer, the grey wolf algorithm, with the gradient descent algorithm, and to test it for applications in feature selection problems. The proposed algorithm was first compared against the original grey wolf algorithm on 23 continuous test functions. The proposed optimizer was then adapted for feature selection, and 3 binary implementations were developed, with the final implementation compared against two implementations of the binary grey wolf optimizer and the binary grey wolf particle swarm optimizer on 6 medical datasets from the UCI machine learning repository, on metrics such as accuracy, size of feature subsets, F-measure, precision, and sensitivity. The proposed optimizer outperformed the three other optimizers on 3 of the 6 datasets in average metrics. The proposed optimizer showed promise in its capability to balance the two objectives in feature selection and could be further enhanced. Hindawi 2021-08-28 /pmc/articles/PMC8421169/ /pubmed/34497846 http://dx.doi.org/10.1155/2021/2555622 Text en Copyright © 2021 Peter Mule Kitonyi and Davies Rene Segera. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. |
spellingShingle | Research Article Kitonyi, Peter Mule Segera, Davies Rene Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection |
title | Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection |
title_full | Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection |
title_fullStr | Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection |
title_full_unstemmed | Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection |
title_short | Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection |
title_sort | hybrid gradient descent grey wolf optimizer for optimal feature selection |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8421169/ https://www.ncbi.nlm.nih.gov/pubmed/34497846 http://dx.doi.org/10.1155/2021/2555622 |
work_keys_str_mv | AT kitonyipetermule hybridgradientdescentgreywolfoptimizerforoptimalfeatureselection AT segeradaviesrene hybridgradientdescentgreywolfoptimizerforoptimalfeatureselection |
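The description in this record mentions binary implementations of the optimizer for feature selection. The sketch below shows one common way such a binarization can work: a sigmoid transfer function maps a continuous wolf position to a feature mask, which is scored by a wrapper fitness trading classification error against subset size. The KNN evaluator, the 0.99/0.01 weights, and all names are assumptions for illustration, not details taken from the paper.

```python
# Hedged sketch of mapping a continuous wolf position to a binary feature mask
# and scoring it, as in binary feature-selection variants of GWO.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def binarize(position, rng):
    """Sigmoid transfer: large positive coordinates tend to select a feature."""
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < prob).astype(int)

def fitness(mask, X, y):
    """Weighted sum of classification error and feature-subset size (both minimized)."""
    if mask.sum() == 0:                       # an empty subset cannot be evaluated
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask == 1], y, cv=5).mean()
    return 0.99 * (1.0 - acc) + 0.01 * mask.sum() / mask.size

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=200, n_features=30, n_informative=5,
                               random_state=0)
    position = rng.normal(size=X.shape[1])    # a continuous wolf position
    mask = binarize(position, rng)
    print("selected features:", int(mask.sum()),
          "fitness:", round(fitness(mask, X, y), 4))
```

The two fitness terms correspond to the two objectives the abstract refers to: maximizing classification performance while minimizing the number of selected features.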