Feature Selection Based on Adaptive Particle Swarm Optimization with Leadership Learning

Bibliographic Details
Main Authors: Ye, Zhiwei, Xu, Yi, He, Qiyi, Wang, Mingwei, Bai, Wanfang, Xiao, Hongwei
Format: Online Article Text
Language: English
Published: Hindawi 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9441366/
https://www.ncbi.nlm.nih.gov/pubmed/36072739
http://dx.doi.org/10.1155/2022/1825341
_version_ 1784782556694052864
author Ye, Zhiwei
Xu, Yi
He, Qiyi
Wang, Mingwei
Bai, Wanfang
Xiao, Hongwei
author_facet Ye, Zhiwei
Xu, Yi
He, Qiyi
Wang, Mingwei
Bai, Wanfang
Xiao, Hongwei
author_sort Ye, Zhiwei
collection PubMed
description With the rapid development of the Internet of Things (IoT), the curse of dimensionality has become increasingly common. Feature selection (FS) aims to eliminate irrelevant and redundant features from datasets. Particle swarm optimization (PSO) is an efficient metaheuristic algorithm that has been successfully applied to obtain optimal feature subsets containing the essential information in an acceptable time. However, it easily falls into local optima when dealing with high-dimensional datasets because of its constant parameter values and insufficient population diversity. In this paper, an FS method based on adaptive PSO with leadership learning (APSOLL) is proposed. An adaptive updating strategy replaces the constant parameters, and a leadership learning strategy is used to maintain population diversity. Experimental results on 10 UCI datasets show that APSOLL has better exploration and exploitation capabilities than PSO, the grey wolf optimizer (GWO), Harris hawks optimization (HHO), the flower pollination algorithm (FPA), the salp swarm algorithm (SSA), linear PSO (LPSO), and hybrid PSO and differential evolution (HPSO-DE). Moreover, APSOLL selects less than 8% of the original features on average, and the selected feature subsets are more effective in most cases than those generated by 6 traditional FS methods (analysis of variance (ANOVA), Chi-Squared (CHI2), Pearson, Spearman, Kendall, and Mutual Information (MI)).
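Note: the record does not include the paper's implementation, so the sketch below only illustrates the general wrapper approach the abstract describes, a binary PSO that searches over 0/1 feature masks and scores each mask with a classifier. The k-NN fitness, sigmoid transfer function, linearly decreasing inertia weight (a simple stand-in for APSOLL's adaptive parameter strategy), and all parameter values are assumptions; the leadership learning strategy itself is not reproduced here.

```python
# Illustrative binary-PSO feature selection (assumed details; not the paper's APSOLL).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)      # stand-in for one of the UCI datasets
n_particles, n_features, n_iters = 20, X.shape[1], 30
alpha = 0.99                                     # assumed weight: accuracy vs. subset size

def fitness(bits):
    """Score a 0/1 feature mask with a k-NN wrapper plus a small size reward."""
    cols = bits.astype(bool)
    if not cols.any():                           # an empty subset is invalid
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, cols], y, cv=3).mean()
    return alpha * acc + (1 - alpha) * (1 - cols.sum() / n_features)

# Initialize bit-string positions, velocities, and personal/global bests.
pos = (rng.random((n_particles, n_features)) < 0.5).astype(int)
vel = rng.uniform(-1.0, 1.0, (n_particles, n_features))
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for t in range(n_iters):
    w = 0.9 - 0.5 * t / n_iters                  # linearly decreasing inertia (assumed, not adaptive)
    r1, r2 = rng.random((2, n_particles, n_features))
    # Standard velocity update, clamped as is common in binary PSO.
    vel = np.clip(w * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos), -6.0, 6.0)
    prob = 1.0 / (1.0 + np.exp(-vel))            # sigmoid transfer: velocity -> bit probability
    pos = (rng.random((n_particles, n_features)) < prob).astype(int)
    fit = np.array([fitness(p) for p in pos])
    better = fit > pbest_fit                     # update personal bests, then the global best
    pbest[better], pbest_fit[better] = pos[better], fit[better]
    gbest = pbest[pbest_fit.argmax()].copy()

print(f"selected {int(gbest.sum())} of {n_features} features, fitness {pbest_fit.max():.4f}")
```

The fitness above blends cross-validated accuracy with a mild reward for smaller subsets, a common wrapper formulation; the paper's actual fitness function, classifier, and parameter schedules may differ.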
format Online
Article
Text
id pubmed-9441366
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-9441366 2022-09-06 Feature Selection Based on Adaptive Particle Swarm Optimization with Leadership Learning Ye, Zhiwei Xu, Yi He, Qiyi Wang, Mingwei Bai, Wanfang Xiao, Hongwei Comput Intell Neurosci Research Article With the rapid development of the Internet of Things (IoT), the curse of dimensionality becomes increasingly common. Feature selection (FS) is to eliminate irrelevant and redundant features in the datasets. Particle swarm optimization (PSO) is an efficient metaheuristic algorithm that has been successfully applied to obtain the optimal feature subset with essential information in an acceptable time. However, it is easy to fall into the local optima when dealing with high-dimensional datasets due to constant parameter values and insufficient population diversity. In the paper, an FS method is proposed by utilizing adaptive PSO with leadership learning (APSOLL). An adaptive updating strategy for parameters is used to replace the constant parameters, and the leadership learning strategy is utilized to provide valid population diversity. Experimental results on 10 UCI datasets show that APSOLL has better exploration and exploitation capabilities through comparison with PSO, grey wolf optimizer (GWO), Harris hawks optimization (HHO), flower pollination algorithm (FPA), salp swarm algorithm (SSA), linear PSO (LPSO), and hybrid PSO and differential evolution (HPSO-DE). Moreover, less than 8% of features in the original datasets are selected on average, and the feature subsets are more effective in most cases compared to those generated by 6 traditional FS methods (analysis of variance (ANOVA), Chi-Squared (CHI2), Pearson, Spearman, Kendall, and Mutual Information (MI)). Hindawi 2022-08-28 /pmc/articles/PMC9441366/ /pubmed/36072739 http://dx.doi.org/10.1155/2022/1825341 Text en Copyright © 2022 Zhiwei Ye et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Ye, Zhiwei
Xu, Yi
He, Qiyi
Wang, Mingwei
Bai, Wanfang
Xiao, Hongwei
Feature Selection Based on Adaptive Particle Swarm Optimization with Leadership Learning
title Feature Selection Based on Adaptive Particle Swarm Optimization with Leadership Learning
title_full Feature Selection Based on Adaptive Particle Swarm Optimization with Leadership Learning
title_fullStr Feature Selection Based on Adaptive Particle Swarm Optimization with Leadership Learning
title_full_unstemmed Feature Selection Based on Adaptive Particle Swarm Optimization with Leadership Learning
title_short Feature Selection Based on Adaptive Particle Swarm Optimization with Leadership Learning
title_sort feature selection based on adaptive particle swarm optimization with leadership learning
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9441366/
https://www.ncbi.nlm.nih.gov/pubmed/36072739
http://dx.doi.org/10.1155/2022/1825341
work_keys_str_mv AT yezhiwei featureselectionbasedonadaptiveparticleswarmoptimizationwithleadershiplearning
AT xuyi featureselectionbasedonadaptiveparticleswarmoptimizationwithleadershiplearning
AT heqiyi featureselectionbasedonadaptiveparticleswarmoptimizationwithleadershiplearning
AT wangmingwei featureselectionbasedonadaptiveparticleswarmoptimizationwithleadershiplearning
AT baiwanfang featureselectionbasedonadaptiveparticleswarmoptimizationwithleadershiplearning
AT xiaohongwei featureselectionbasedonadaptiveparticleswarmoptimizationwithleadershiplearning