A Semi-Supervised Stacked Autoencoder Using the Pseudo Label for Classification Tasks
Main Authors: Lai, Jie; Wang, Xiaodan; Xiang, Qian; Quan, Wen; Song, Yafei
Format: Online Article Text
Language: English
Published: MDPI, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10528325/ https://www.ncbi.nlm.nih.gov/pubmed/37761573 http://dx.doi.org/10.3390/e25091274
_version_ | 1785111251556237312 |
author | Lai, Jie; Wang, Xiaodan; Xiang, Qian; Quan, Wen; Song, Yafei
author_facet | Lai, Jie; Wang, Xiaodan; Xiang, Qian; Quan, Wen; Song, Yafei
author_sort | Lai, Jie |
collection | PubMed |
description | The efficiency and cognitive limitations of manual sample labeling result in a large number of unlabeled training samples in practical applications. Making full use of both labeled and unlabeled samples is the key to solving the semi-supervised problem. However, as a supervised algorithm, the stacked autoencoder (SAE) considers only labeled samples and is therefore difficult to apply to semi-supervised problems. Thus, by introducing the pseudo-labeling method into the SAE, a novel pseudo-label-based semi-supervised stacked autoencoder (PL-SSAE) is proposed to address semi-supervised classification tasks. The PL-SSAE first performs unsupervised pre-training on all samples with an autoencoder (AE) to initialize the network parameters. Then, the network parameters are iteratively fine-tuned on the labeled samples, after which the unlabeled samples are classified and their pseudo labels are generated. Finally, the pseudo-labeled samples are used to construct a regularization term, and the network parameters are fine-tuned again to complete the training of the PL-SSAE. Unlike the traditional SAE, the PL-SSAE uses all samples in pre-training and the pseudo-labeled unlabeled samples in fine-tuning, thereby fully exploiting the feature and category information of the unlabeled samples. Empirical evaluations on various benchmark datasets show that the semi-supervised performance of the PL-SSAE is more competitive than that of the SAE, the sparse stacked autoencoder (SSAE), the semi-supervised stacked autoencoder (Semi-SAE) and the semi-supervised sparse stacked autoencoder (Semi-SSAE).
format | Online Article Text |
id | pubmed-10528325 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-10528325 2023-09-28 A Semi-Supervised Stacked Autoencoder Using the Pseudo Label for Classification Tasks Lai, Jie; Wang, Xiaodan; Xiang, Qian; Quan, Wen; Song, Yafei Entropy (Basel) Article MDPI 2023-08-30 /pmc/articles/PMC10528325/ /pubmed/37761573 http://dx.doi.org/10.3390/e25091274 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle | Article; Lai, Jie; Wang, Xiaodan; Xiang, Qian; Quan, Wen; Song, Yafei; A Semi-Supervised Stacked Autoencoder Using the Pseudo Label for Classification Tasks
title | A Semi-Supervised Stacked Autoencoder Using the Pseudo Label for Classification Tasks |
title_full | A Semi-Supervised Stacked Autoencoder Using the Pseudo Label for Classification Tasks |
title_fullStr | A Semi-Supervised Stacked Autoencoder Using the Pseudo Label for Classification Tasks |
title_full_unstemmed | A Semi-Supervised Stacked Autoencoder Using the Pseudo Label for Classification Tasks |
title_short | A Semi-Supervised Stacked Autoencoder Using the Pseudo Label for Classification Tasks |
title_sort | semi-supervised stacked autoencoder using the pseudo label for classification tasks |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10528325/ https://www.ncbi.nlm.nih.gov/pubmed/37761573 http://dx.doi.org/10.3390/e25091274 |
work_keys_str_mv | AT laijie asemisupervisedstackedautoencoderusingthepseudolabelforclassificationtasks AT wangxiaodan asemisupervisedstackedautoencoderusingthepseudolabelforclassificationtasks AT xiangqian asemisupervisedstackedautoencoderusingthepseudolabelforclassificationtasks AT quanwen asemisupervisedstackedautoencoderusingthepseudolabelforclassificationtasks AT songyafei asemisupervisedstackedautoencoderusingthepseudolabelforclassificationtasks AT laijie semisupervisedstackedautoencoderusingthepseudolabelforclassificationtasks AT wangxiaodan semisupervisedstackedautoencoderusingthepseudolabelforclassificationtasks AT xiangqian semisupervisedstackedautoencoderusingthepseudolabelforclassificationtasks AT quanwen semisupervisedstackedautoencoderusingthepseudolabelforclassificationtasks AT songyafei semisupervisedstackedautoencoderusingthepseudolabelforclassificationtasks |
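The abstract above outlines a three-step training procedure: unsupervised layer-wise pre-training on all samples, supervised fine-tuning on the labeled samples followed by pseudo-label generation for the unlabeled samples, and a final fine-tuning pass with a pseudo-label regularization term. The following is a minimal PyTorch sketch of that procedure; the layer sizes, epoch counts, optimizer settings, and the weight `lambda_pl` are illustrative assumptions rather than values from the paper, and the regularization term is rendered here simply as a weighted cross-entropy loss on the pseudo-labeled samples, not necessarily the paper's exact formulation.

```python
# Minimal sketch of the PL-SSAE training procedure described in the abstract.
# Hyperparameters (hidden sizes, epochs, lambda_pl) are illustrative, not from the paper.
import torch
import torch.nn as nn

def pretrain_autoencoder(layer, data, epochs=20, lr=1e-3):
    """Step 1 (per layer): greedy unsupervised pre-training with a reconstruction loss."""
    decoder = nn.Linear(layer.out_features, layer.in_features)
    opt = torch.optim.Adam(list(layer.parameters()) + list(decoder.parameters()), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        recon = decoder(torch.sigmoid(layer(data)))
        nn.functional.mse_loss(recon, data).backward()
        opt.step()
    return torch.sigmoid(layer(data)).detach()  # activations become the next layer's input

class StackedEncoder(nn.Module):
    """Stacked encoder layers plus a classification head."""
    def __init__(self, dims, n_classes):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(i, o) for i, o in zip(dims, dims[1:]))
        self.head = nn.Linear(dims[-1], n_classes)
    def forward(self, x):
        for layer in self.layers:
            x = torch.sigmoid(layer(x))
        return self.head(x)

def train_pl_ssae(x_all, x_lab, y_lab, x_unlab, dims, n_classes, lambda_pl=0.5):
    model = StackedEncoder(dims, n_classes)
    # Step 1: unsupervised layer-wise pre-training on ALL samples (labeled + unlabeled).
    h = x_all
    for layer in model.layers:
        h = pretrain_autoencoder(layer, h)
    ce = nn.CrossEntropyLoss()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Step 2: supervised fine-tuning on the labeled samples only.
    for _ in range(100):
        opt.zero_grad()
        ce(model(x_lab), y_lab).backward()
        opt.step()
    # Classify the unlabeled samples to generate their pseudo labels.
    with torch.no_grad():
        pseudo = model(x_unlab).argmax(dim=1)
    # Step 3: fine-tune again with a pseudo-label regularization term
    # (sketched here as a weighted cross-entropy on the pseudo-labeled samples).
    for _ in range(100):
        opt.zero_grad()
        loss = ce(model(x_lab), y_lab) + lambda_pl * ce(model(x_unlab), pseudo)
        loss.backward()
        opt.step()
    return model
```

In this sketch, `lambda_pl` balances the supervised loss against the pseudo-label term, controlling how strongly the unlabeled samples influence the final fine-tuning; the paper's exact regularization term and hyperparameters should be taken from the article itself.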