Kernel Partial Least Squares Feature Selection Based on Maximum Weight Minimum Redundancy
Main authors: | Liu, Xiling; Zhou, Shuisheng |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2023 |
Subjects: | Article |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9955929/ https://www.ncbi.nlm.nih.gov/pubmed/36832691 http://dx.doi.org/10.3390/e25020325 |
_version_ | 1784894467729260544 |
author | Liu, Xiling; Zhou, Shuisheng |
author_facet | Liu, Xiling; Zhou, Shuisheng |
author_sort | Liu, Xiling |
collection | PubMed |
description | Feature selection plays a vital role in machine learning and data mining. The maximum weight minimum redundancy feature selection method considers not only the importance of features but also the redundancy among them. However, the characteristics of datasets vary, so a feature selection method should apply different feature evaluation criteria to different datasets. Additionally, high-dimensional data analysis poses a challenge to the classification performance of feature selection methods. This study presents a kernel partial least squares (KPLS) feature selection method based on an enhanced maximum weight minimum redundancy algorithm, which simplifies computation and improves classification accuracy on high-dimensional datasets. By introducing a weight factor, the balance between maximum weight and minimum redundancy in the evaluation criterion can be adjusted, yielding an improved maximum weight minimum redundancy method. The proposed KPLS feature selection method considers both the redundancy among features and the weighting between each feature and the class label across different datasets. The method's classification accuracy was further tested on noisy data and on several datasets. Experimental results on different datasets demonstrate the feasibility and effectiveness of the proposed method, which selects a near-optimal feature subset and achieves strong classification performance on three different metrics compared with other feature selection methods. |
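The weight-factor idea in the abstract can be illustrated with a small sketch (not the authors' code): a greedy selector scores each candidate feature as `alpha` times its relevance minus `(1 - alpha)` times its mean redundancy against already-selected features. As an assumption, relevance and redundancy are approximated here with absolute Pearson correlation; the paper instead derives feature weights from kernel partial least squares.

```python
import numpy as np

def select_features(X, y, k, alpha=0.5):
    """Greedy maximum weight minimum redundancy selection (illustrative sketch).

    X: (n_samples, n_features) data matrix
    y: (n_samples,) class labels
    k: number of features to select
    alpha: weight factor trading off relevance against redundancy
    """
    n_features = X.shape[1]
    # Relevance of each feature: |corr(feature, label)| stands in for the
    # KPLS-derived feature weight used in the paper.
    relevance = np.array(
        [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)]
    )
    selected, remaining = [], list(range(n_features))
    while len(selected) < k and remaining:
        best_j, best_score = None, -np.inf
        for j in remaining:
            # Mean absolute correlation with already-selected features.
            if selected:
                redundancy = np.mean(
                    [abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected]
                )
            else:
                redundancy = 0.0
            # The weight factor alpha adjusts the relevance/redundancy balance.
            score = alpha * relevance[j] - (1.0 - alpha) * redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

With `alpha = 1` the criterion reduces to pure relevance ranking; lowering `alpha` increasingly penalizes features that duplicate information already captured by the selected subset.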
format | Online Article Text |
id | pubmed-9955929 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9955929 2023-02-25 Kernel Partial Least Squares Feature Selection Based on Maximum Weight Minimum Redundancy Liu, Xiling; Zhou, Shuisheng Entropy (Basel) Article MDPI 2023-02-10 /pmc/articles/PMC9955929/ /pubmed/36832691 http://dx.doi.org/10.3390/e25020325 Text en © 2023 by the authors. 
https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Liu, Xiling Zhou, Shuisheng Kernel Partial Least Squares Feature Selection Based on Maximum Weight Minimum Redundancy |
title | Kernel Partial Least Squares Feature Selection Based on Maximum Weight Minimum Redundancy |
title_full | Kernel Partial Least Squares Feature Selection Based on Maximum Weight Minimum Redundancy |
title_fullStr | Kernel Partial Least Squares Feature Selection Based on Maximum Weight Minimum Redundancy |
title_full_unstemmed | Kernel Partial Least Squares Feature Selection Based on Maximum Weight Minimum Redundancy |
title_short | Kernel Partial Least Squares Feature Selection Based on Maximum Weight Minimum Redundancy |
title_sort | kernel partial least squares feature selection based on maximum weight minimum redundancy |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9955929/ https://www.ncbi.nlm.nih.gov/pubmed/36832691 http://dx.doi.org/10.3390/e25020325 |
work_keys_str_mv | AT liuxiling kernelpartialleastsquaresfeatureselectionbasedonmaximumweightminimumredundancy AT zhoushuisheng kernelpartialleastsquaresfeatureselectionbasedonmaximumweightminimumredundancy |