
Optimal combination of feature selection and classification via local hyperplane based learning strategy

Bibliographic Details
Main Authors: Cheng, Xiaoping, Cai, Hongmin, Zhang, Yue, Xu, Bo, Su, Weifeng
Format: Online Article Text
Language: English
Published: BioMed Central 2015
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4498526/
https://www.ncbi.nlm.nih.gov/pubmed/26159165
http://dx.doi.org/10.1186/s12859-015-0629-6
_version_ 1782380637186949120
author Cheng, Xiaoping
Cai, Hongmin
Zhang, Yue
Xu, Bo
Su, Weifeng
author_facet Cheng, Xiaoping
Cai, Hongmin
Zhang, Yue
Xu, Bo
Su, Weifeng
author_sort Cheng, Xiaoping
collection PubMed
description BACKGROUND: Classifying cancers by gene selection is among the most important and challenging procedures in biomedicine. A major challenge is to design an effective method that eliminates irrelevant, redundant, or noisy genes from the classification, while retaining all of the highly discriminative genes. RESULTS: We propose a gene selection method, called local hyperplane-based discriminant analysis (LHDA). LHDA adopts two central ideas. First, it uses a local approximation rather than a global measurement; second, it embeds a recently reported classification model, the K-Local Hyperplane Distance Nearest Neighbor (HKNN) classifier, into its discriminator. Through classification accuracy-based iterations, LHDA obtains the feature weight vector and finally extracts the optimal feature subset. The performance of the proposed method is evaluated in extensive experiments on synthetic and real microarray benchmark datasets. Eight classical feature selection methods, four classification models, and two popular embedded learning schemes, including k-nearest neighbor (KNN), hyperplane k-nearest neighbor (HKNN), Support Vector Machine (SVM), and Random Forest, are employed for comparison. CONCLUSION: The proposed method yielded performance comparable or superior to that of seven state-of-the-art models. This strong performance demonstrates the advantage of combining feature weighting and model learning in a unified framework that achieves the two tasks simultaneously. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12859-015-0629-6) contains supplementary material, which is available to authorized users.
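
To make the HKNN component of the abstract concrete, the following is a minimal sketch of the standard K-Local Hyperplane Distance Nearest Neighbor rule, not the authors' LHDA code. It assumes the usual formulation in which a query point is assigned to the class whose local hyperplane, spanned by the query's k nearest same-class neighbors, is closest; the function name hknn_predict, the regularization parameter lam, and the NumPy-based implementation are illustrative assumptions. In LHDA, a learned feature weight vector would presumably be applied by rescaling the columns of X_train and x_query before these distances are computed.

    # Sketch of HKNN classification (assumed standard formulation, not the paper's code).
    import numpy as np

    def hknn_predict(X_train, y_train, x_query, k=5, lam=1.0):
        """Assign x_query to the class whose local hyperplane is nearest."""
        best_label, best_dist = None, np.inf
        for label in np.unique(y_train):
            Xc = X_train[y_train == label]                        # samples of this class
            # k nearest neighbors of the query within the class
            idx = np.argsort(np.linalg.norm(Xc - x_query, axis=1))[:k]
            N = Xc[idx]                                           # (k', d) neighbor matrix
            mu = N.mean(axis=0)                                   # reference point on the hyperplane
            V = (N - mu).T                                        # (d, k') spanning directions
            # ridge-regularized least squares: project (x_query - mu) onto span(V)
            A = V.T @ V + lam * np.eye(N.shape[0])
            alpha = np.linalg.solve(A, V.T @ (x_query - mu))
            dist = np.linalg.norm(x_query - mu - V @ alpha)       # distance to the local hyperplane
            if dist < best_dist:
                best_label, best_dist = label, dist
        return best_label

For example, hknn_predict(X, y, X[0], k=3) returns a predicted class label for the first sample; with lam = 0 the rule reduces to the unpenalized hyperplane distance, which can overfit when k approaches the feature dimension, hence the penalty term.
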
format Online
Article
Text
id pubmed-4498526
institution National Center for Biotechnology Information
language English
publishDate 2015
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-44985262015-07-11 Optimal combination of feature selection and classification via local hyperplane based learning strategy Cheng, Xiaoping Cai, Hongmin Zhang, Yue Xu, Bo Su, Weifeng BMC Bioinformatics Methodology Article BACKGROUND: Classifying cancers by gene selection is among the most important and challenging procedures in biomedicine. A major challenge is to design an effective method that eliminates irrelevant, redundant, or noisy genes from the classification, while retaining all of the highly discriminative genes. RESULTS: We propose a gene selection method, called local hyperplane-based discriminant analysis (LHDA). LHDA adopts two central ideas. First, it uses a local approximation rather than a global measurement; second, it embeds a recently reported classification model, the K-Local Hyperplane Distance Nearest Neighbor (HKNN) classifier, into its discriminator. Through classification accuracy-based iterations, LHDA obtains the feature weight vector and finally extracts the optimal feature subset. The performance of the proposed method is evaluated in extensive experiments on synthetic and real microarray benchmark datasets. Eight classical feature selection methods, four classification models, and two popular embedded learning schemes, including k-nearest neighbor (KNN), hyperplane k-nearest neighbor (HKNN), Support Vector Machine (SVM), and Random Forest, are employed for comparison. CONCLUSION: The proposed method yielded performance comparable or superior to that of seven state-of-the-art models. This strong performance demonstrates the advantage of combining feature weighting and model learning in a unified framework that achieves the two tasks simultaneously. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12859-015-0629-6) contains supplementary material, which is available to authorized users. BioMed Central 2015-07-10 /pmc/articles/PMC4498526/ /pubmed/26159165 http://dx.doi.org/10.1186/s12859-015-0629-6 Text en © Cheng et al. 2015 This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
spellingShingle Methodology Article
Cheng, Xiaoping
Cai, Hongmin
Zhang, Yue
Xu, Bo
Su, Weifeng
Optimal combination of feature selection and classification via local hyperplane based learning strategy
title Optimal combination of feature selection and classification via local hyperplane based learning strategy
title_full Optimal combination of feature selection and classification via local hyperplane based learning strategy
title_fullStr Optimal combination of feature selection and classification via local hyperplane based learning strategy
title_full_unstemmed Optimal combination of feature selection and classification via local hyperplane based learning strategy
title_short Optimal combination of feature selection and classification via local hyperplane based learning strategy
title_sort optimal combination of feature selection and classification via local hyperplane based learning strategy
topic Methodology Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4498526/
https://www.ncbi.nlm.nih.gov/pubmed/26159165
http://dx.doi.org/10.1186/s12859-015-0629-6
work_keys_str_mv AT chengxiaoping optimalcombinationoffeatureselectionandclassificationvialocalhyperplanebasedlearningstrategy
AT caihongmin optimalcombinationoffeatureselectionandclassificationvialocalhyperplanebasedlearningstrategy
AT zhangyue optimalcombinationoffeatureselectionandclassificationvialocalhyperplanebasedlearningstrategy
AT xubo optimalcombinationoffeatureselectionandclassificationvialocalhyperplanebasedlearningstrategy
AT suweifeng optimalcombinationoffeatureselectionandclassificationvialocalhyperplanebasedlearningstrategy