
Noise-injected neural networks show promise for use on small-sample expression data

BACKGROUND: Overfitting the data is a salient issue for classifier design in small-sample settings. This is why selecting a classifier from a constrained family of classifiers, ones that do not possess the potential to too finely partition the feature space, is typically preferable. But overfitting...

Full description

Bibliographic Details
Main Authors: Hua, Jianping, Lowey, James, Xiong, Zixiang, Dougherty, Edward R
Format: Text
Language: English
Published: BioMed Central 2006
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1524820/
https://www.ncbi.nlm.nih.gov/pubmed/16737545
http://dx.doi.org/10.1186/1471-2105-7-274
_version_ 1782128853637922816
author Hua, Jianping
Lowey, James
Xiong, Zixiang
Dougherty, Edward R
author_facet Hua, Jianping
Lowey, James
Xiong, Zixiang
Dougherty, Edward R
author_sort Hua, Jianping
collection PubMed
description BACKGROUND: Overfitting the data is a salient issue in classifier design for small-sample settings. It is therefore typically preferable to select a classifier from a constrained family of classifiers, ones that lack the capacity to partition the feature space too finely. But overfitting is not merely a consequence of the classifier family; it depends strongly on the classification rule used to design a classifier from the sample data. Thus, it is possible to consider families that are rather complex but for which there are classification rules that perform well for small samples. Such classification rules can be advantageous because they facilitate satisfactory classification when the class-conditional distributions are not easily separated and the sample is not large. Here we consider neural networks, both from the perspective of classical design based solely on the sample data and from that of noise-injection-based design. RESULTS: This paper provides an extensive simulation-based comparative study of noise-injected neural-network design. It considers a number of different feature-label models across various small sample sizes, using varying amounts of noise injection. Besides comparing noise-injected neural-network design to classical neural-network design, the paper compares it to a number of other classification rules. Our particular interest is in the use of microarray data for expression-based classification for diagnosis and prognosis. To that end, we consider noise-injected neural-network design as it relates to a study of the survivability of breast cancer patients. CONCLUSION: In many instances noise-injected neural-network design is superior to the other tested methods, and in almost all cases it does not perform substantially worse than the best of them. Since the amount of noise injected is consequential, the effect of differing amounts of injected noise must be considered.
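The core idea of noise injection described above, augmenting a small training sample by replicating each point with random perturbations before classifier design, can be sketched as follows. This is an illustrative minimal sketch, not the authors' implementation; the function name `noise_inject`, the spherical Gaussian noise model, and the parameter values are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def noise_inject(X, y, n_copies=10, sigma=0.1, rng=rng):
    """Augment a small sample by spherical Gaussian noise injection.

    Each original point is replicated n_copies times and perturbed
    with N(0, sigma^2 I) noise; labels carry over unchanged. The
    augmented set would then be fed to the classifier design rule
    (e.g. neural-network training) in place of the raw sample.
    """
    X_rep = np.repeat(X, n_copies, axis=0)
    y_rep = np.repeat(y, n_copies, axis=0)
    X_noisy = X_rep + rng.normal(0.0, sigma, size=X_rep.shape)
    return X_noisy, y_rep

# Tiny two-class sample: 4 points, 2 features.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
X_aug, y_aug = noise_inject(X, y, n_copies=25, sigma=0.05)
# 4 points x 25 copies = 100 augmented training points.
```

As the conclusion notes, the amount of injected noise (here `sigma`) is consequential, so in practice one would evaluate several values rather than fix it a priori.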
format Text
id pubmed-1524820
institution National Center for Biotechnology Information
language English
publishDate 2006
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-1524820 2006-08-01 Noise-injected neural networks show promise for use on small-sample expression data Hua, Jianping Lowey, James Xiong, Zixiang Dougherty, Edward R BMC Bioinformatics Research Article BACKGROUND: Overfitting the data is a salient issue for classifier design in small-sample settings. This is why selecting a classifier from a constrained family of classifiers, ones that do not possess the potential to too finely partition the feature space, is typically preferable. But overfitting is not merely a consequence of the classifier family; it is highly dependent on the classification rule used to design a classifier from the sample data. Thus, it is possible to consider families that are rather complex but for which there are classification rules that perform well for small samples. Such classification rules can be advantageous because they facilitate satisfactory classification when the class-conditional distributions are not easily separated and the sample is not large. Here we consider neural networks, from the perspectives of classical design based solely on the sample data and from noise-injection-based design. RESULTS: This paper provides an extensive simulation-based comparative study of noise-injected neural-network design. It considers a number of different feature-label models across various small sample sizes using varying amounts of noise injection. Besides comparing noise-injected neural-network design to classical neural-network design, the paper compares it to a number of other classification rules. Our particular interest is with the use of microarray data for expression-based classification for diagnosis and prognosis. To that end, we consider noise-injected neural-network design as it relates to a study of survivability of breast cancer patients.
CONCLUSION: The conclusion is that in many instances noise-injected neural network design is superior to the other tested methods, and in almost all cases it does not perform substantially worse than the best of the other methods. Since the amount of noise injected is consequential, the effect of differing amounts of injected noise must be considered. BioMed Central 2006-05-31 /pmc/articles/PMC1524820/ /pubmed/16737545 http://dx.doi.org/10.1186/1471-2105-7-274 Text en Copyright © 2006 Hua et al; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Hua, Jianping
Lowey, James
Xiong, Zixiang
Dougherty, Edward R
Noise-injected neural networks show promise for use on small-sample expression data
title Noise-injected neural networks show promise for use on small-sample expression data
title_full Noise-injected neural networks show promise for use on small-sample expression data
title_fullStr Noise-injected neural networks show promise for use on small-sample expression data
title_full_unstemmed Noise-injected neural networks show promise for use on small-sample expression data
title_short Noise-injected neural networks show promise for use on small-sample expression data
title_sort noise-injected neural networks show promise for use on small-sample expression data
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1524820/
https://www.ncbi.nlm.nih.gov/pubmed/16737545
http://dx.doi.org/10.1186/1471-2105-7-274
work_keys_str_mv AT huajianping noiseinjectedneuralnetworksshowpromiseforuseonsmallsampleexpressiondata
AT loweyjames noiseinjectedneuralnetworksshowpromiseforuseonsmallsampleexpressiondata
AT xiongzixiang noiseinjectedneuralnetworksshowpromiseforuseonsmallsampleexpressiondata
AT doughertyedwardr noiseinjectedneuralnetworksshowpromiseforuseonsmallsampleexpressiondata