A Classification Method for the Cellular Images Based on Active Learning and Cross-Modal Transfer Learning
In computer-aided diagnosis (CAD) systems, the automatic classification of the different types of human epithelial type 2 (HEp-2) cells represents one of the critical steps in the diagnosis procedure of autoimmune diseases. Most methods tackle this task using the supervised lear...
Main Authors: | Vununu, Caleb; Lee, Suk-Hwan; Kwon, Ki-Ryong |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2021 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7923434/ https://www.ncbi.nlm.nih.gov/pubmed/33672489 http://dx.doi.org/10.3390/s21041469 |
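The abstract describes parallel deep residual networks that each take a different set of wavelet coefficients of the same cell image as input. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: the single-level 2D discrete wavelet transform via PyWavelets, the four torchvision ResNet-18 branches, the concatenation-plus-linear fusion head, and the six output classes are all assumptions rather than details taken from the paper.

```python
import numpy as np
import pywt
import torch
import torch.nn as nn
from torchvision.models import resnet18


def wavelet_bands(image: np.ndarray, wavelet: str = "db1"):
    """Single-level 2D DWT of a grayscale image: approximation + 3 detail bands."""
    cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)
    return [cA, cH, cV, cD]


class ParallelWaveletResNet(nn.Module):
    """Four ResNet-18 branches, one per wavelet band, fused by a linear head."""

    def __init__(self, num_classes: int = 6):
        super().__init__()
        self.branches = nn.ModuleList()
        for _ in range(4):
            branch = resnet18(weights=None)
            # Each band is a single-channel coefficient map, so replace the RGB stem.
            branch.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2,
                                     padding=3, bias=False)
            branch.fc = nn.Identity()  # expose the 512-d feature vector
            self.branches.append(branch)
        self.head = nn.Linear(4 * 512, num_classes)

    def forward(self, bands):
        # bands: list of four tensors, each of shape (batch, 1, H, W)
        feats = [net(x) for net, x in zip(self.branches, bands)]
        return self.head(torch.cat(feats, dim=1))


if __name__ == "__main__":
    # Toy example: one random 64x64 "cell image" pushed through the pipeline.
    img = np.random.rand(64, 64).astype(np.float32)
    bands = [torch.from_numpy(b).float().unsqueeze(0).unsqueeze(0)
             for b in wavelet_bands(img)]
    model = ParallelWaveletResNet(num_classes=6)
    print(model(bands).shape)  # torch.Size([1, 6])
```

Each branch sees one coefficient band (approximation plus horizontal, vertical, and diagonal details); the paper's actual fusion strategy, number of branches, and wavelet choice may differ.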
_version_ | 1783658901705588736 |
---|---|
author | Vununu, Caleb; Lee, Suk-Hwan; Kwon, Ki-Ryong
author_facet | Vununu, Caleb; Lee, Suk-Hwan; Kwon, Ki-Ryong
author_sort | Vununu, Caleb |
collection | PubMed |
description | In computer-aided diagnosis (CAD) systems, the automatic classification of the different types of human epithelial type 2 (HEp-2) cells represents one of the critical steps in the diagnosis procedure of autoimmune diseases. Most methods tackle this task using the supervised learning paradigm. However, the need for thousands of manually annotated examples is a serious concern for state-of-the-art HEp-2 cell classification methods. In this work, we present a method that uses active learning to minimize the need to annotate the majority of the examples in the dataset. For this purpose, we use cross-modal transfer learning coupled with parallel deep residual networks. First, the parallel networks, which simultaneously take different wavelet coefficients as inputs, are trained in a fully supervised way on a very small, already annotated dataset. Then, the trained networks are applied to the target dataset, which is considerably larger than the first one, using active learning techniques to select only the images that truly need to be annotated. The obtained results show that active learning, when combined with an efficient transfer learning technique, can achieve satisfying discrimination performance with only a few annotated examples in hand. This will help in building CAD systems by simplifying the burdensome task of labeling images while maintaining performance comparable to the state-of-the-art methods. |
format | Online Article Text |
id | pubmed-7923434 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-7923434 2021-03-03 A Classification Method for the Cellular Images Based on Active Learning and Cross-Modal Transfer Learning Vununu, Caleb; Lee, Suk-Hwan; Kwon, Ki-Ryong Sensors (Basel) Article In computer-aided diagnosis (CAD) systems, the automatic classification of the different types of human epithelial type 2 (HEp-2) cells represents one of the critical steps in the diagnosis procedure of autoimmune diseases. Most methods tackle this task using the supervised learning paradigm. However, the need for thousands of manually annotated examples is a serious concern for state-of-the-art HEp-2 cell classification methods. In this work, we present a method that uses active learning to minimize the need to annotate the majority of the examples in the dataset. For this purpose, we use cross-modal transfer learning coupled with parallel deep residual networks. First, the parallel networks, which simultaneously take different wavelet coefficients as inputs, are trained in a fully supervised way on a very small, already annotated dataset. Then, the trained networks are applied to the target dataset, which is considerably larger than the first one, using active learning techniques to select only the images that truly need to be annotated. The obtained results show that active learning, when combined with an efficient transfer learning technique, can achieve satisfying discrimination performance with only a few annotated examples in hand. This will help in building CAD systems by simplifying the burdensome task of labeling images while maintaining performance comparable to the state-of-the-art methods. MDPI 2021-02-20 /pmc/articles/PMC7923434/ /pubmed/33672489 http://dx.doi.org/10.3390/s21041469 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article; Vununu, Caleb; Lee, Suk-Hwan; Kwon, Ki-Ryong; A Classification Method for the Cellular Images Based on Active Learning and Cross-Modal Transfer Learning
title | A Classification Method for the Cellular Images Based on Active Learning and Cross-Modal Transfer Learning |
title_full | A Classification Method for the Cellular Images Based on Active Learning and Cross-Modal Transfer Learning |
title_fullStr | A Classification Method for the Cellular Images Based on Active Learning and Cross-Modal Transfer Learning |
title_full_unstemmed | A Classification Method for the Cellular Images Based on Active Learning and Cross-Modal Transfer Learning |
title_short | A Classification Method for the Cellular Images Based on Active Learning and Cross-Modal Transfer Learning |
title_sort | classification method for the cellular images based on active learning and cross-modal transfer learning |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7923434/ https://www.ncbi.nlm.nih.gov/pubmed/33672489 http://dx.doi.org/10.3390/s21041469 |
work_keys_str_mv | AT vununucaleb aclassificationmethodforthecellularimagesbasedonactivelearningandcrossmodaltransferlearning AT leesukhwan aclassificationmethodforthecellularimagesbasedonactivelearningandcrossmodaltransferlearning AT kwonkiryong aclassificationmethodforthecellularimagesbasedonactivelearningandcrossmodaltransferlearning AT vununucaleb classificationmethodforthecellularimagesbasedonactivelearningandcrossmodaltransferlearning AT leesukhwan classificationmethodforthecellularimagesbasedonactivelearningandcrossmodaltransferlearning AT kwonkiryong classificationmethodforthecellularimagesbasedonactivelearningandcrossmodaltransferlearning |
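The abstract also states that the networks trained on the small annotated source set are then used on the larger target dataset so that only the images that truly need annotation are selected. The sketch below shows one common way such a selection step could be realized, uncertainty sampling by predictive entropy, for a generic classifier; the acquisition function, the `budget` parameter, and the assumption that the unlabeled loader yields `(inputs, index)` pairs are illustrative choices, not the query strategy reported in the paper.

```python
import torch
import torch.nn.functional as F


@torch.no_grad()
def select_for_annotation(model, unlabeled_loader, budget, device="cpu"):
    """Rank unlabeled examples by predictive entropy and return the indices of the
    `budget` most uncertain ones (assumes the loader yields (inputs, index) pairs)."""
    model.eval()
    all_entropy, all_idx = [], []
    for inputs, idx in unlabeled_loader:
        logits = model(inputs.to(device))
        probs = F.softmax(logits, dim=1)
        # Predictive entropy: high value = model is unsure about this image.
        entropy = -(probs * torch.log(probs + 1e-12)).sum(dim=1)
        all_entropy.append(entropy.cpu())
        all_idx.append(idx)
    all_entropy = torch.cat(all_entropy)
    all_idx = torch.cat(all_idx)
    order = torch.argsort(all_entropy, descending=True)  # most uncertain first
    return all_idx[order[:budget]].tolist()
```

Entropy is only one acquisition criterion; least-confidence or margin sampling are common alternatives. The selected images would then be sent to an annotator, added to the labeled pool, and the model retrained before the next selection round.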