
Combining weakly and strongly supervised learning improves strong supervision in Gleason pattern classification

Bibliographic Details
Main Authors: Otálora, Sebastian, Marini, Niccolò, Müller, Henning, Atzori, Manfredo
Format: Online Article Text
Language: English
Published: BioMed Central 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8105943/
https://www.ncbi.nlm.nih.gov/pubmed/33964886
http://dx.doi.org/10.1186/s12880-021-00609-0
author Otálora, Sebastian
Marini, Niccolò
Müller, Henning
Atzori, Manfredo
author_sort Otálora, Sebastian
collection PubMed
description BACKGROUND: One challenge in training deep convolutional neural network (CNN) models with whole slide images (WSIs) is providing the required large number of costly, manually annotated image regions. Strategies to alleviate the scarcity of annotated data include transfer learning, data augmentation, and training the models with less expensive image-level annotations (weakly supervised learning). However, it is not clear how to combine transfer learning in a CNN model when different data sources are available for training, or how to leverage the combination of large amounts of weakly annotated images with a set of local region annotations. This paper aims to evaluate CNN training strategies based on transfer learning that leverage the combination of weak and strong annotations from heterogeneous data sources. The trade-off between classification performance and annotation effort is explored by evaluating a CNN that learns from strong labels (region annotations) and is later fine-tuned on a dataset with less expensive weak (image-level) labels. RESULTS: As expected, the model performance on strongly annotated data steadily increases as the percentage of strong annotations used increases, reaching a performance comparable to pathologists ([Formula: see text]). Nevertheless, the performance sharply decreases when the model is applied to the WSI classification scenario ([Formula: see text]), and remains lower regardless of the number of annotations used. The model performance increases when the model is fine-tuned for the task of Gleason scoring with the weak WSI labels ([Formula: see text]). CONCLUSION: Combining weak and strong supervision improves strong supervision in the classification of Gleason patterns using tissue microarrays (TMAs) and WSI regions. Our results suggest effective strategies for training CNN models that combine small amounts of annotated data with heterogeneous data sources.
Performance in the controlled TMA scenario increases with the number of annotations used to train the model. Nevertheless, performance is hindered when the trained TMA model is applied directly to the more challenging WSI classification problem. This demonstrates that a good pre-trained model for prostate cancer TMA image classification may lead to the best downstream model if fine-tuned on the WSI target dataset. The source code for reproducing the experiments in the paper is available at: https://github.com/ilmaro8/Digital_Pathology_Transfer_Learning
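The two-stage strategy summarized in the abstract — pre-train on strongly annotated regions, then fine-tune the same weights on weakly labeled data — can be sketched in miniature. This is an illustrative sketch only: a toy logistic-regression model on synthetic data stands in for the paper's CNN and histopathology images, and every variable and number below is invented for the example.

```python
import numpy as np

# Sketch of the two-stage strategy: stage 1 trains on "strong"
# (region-level) labels; stage 2 fine-tunes the resulting weights on
# "weak" (image-level) labels from a shifted target distribution.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(w, X, y, lr=0.1, epochs=300):
    # Plain batch gradient descent on the logistic loss, starting from
    # the given weights w — starting from pre-trained weights is what
    # makes stage 2 fine-tuning rather than training from scratch.
    for _ in range(epochs):
        p = sigmoid(X @ w)
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def with_bias(X):
    # Append a constant column so the model can learn an intercept.
    return np.hstack([X, np.ones((len(X), 1))])

# Stage 1: "strong" supervision (analogous to annotated TMA regions).
X_strong = with_bias(rng.normal(size=(400, 5)))
y_strong = (X_strong[:, 0] + X_strong[:, 1] > 0).astype(float)
w = train(np.zeros(6), X_strong, y_strong)

# Stage 2: fine-tune on "weak" labels from a shifted distribution
# (analogous to WSI-level Gleason scores).
X_weak = with_bias(rng.normal(loc=0.5, size=(400, 5)))
y_weak = (X_weak[:, 0] + X_weak[:, 1] > 1.0).astype(float)
w_finetuned = train(w.copy(), X_weak, y_weak, lr=0.05)

acc = np.mean((sigmoid(X_weak @ w_finetuned) > 0.5) == y_weak)
print(f"fine-tuned accuracy on weak-label data: {acc:.2f}")
```

The design point the sketch mirrors is the paper's finding: the stage-1 model alone fits the source task, and reusing its weights as the starting point for the target task is what transfers that knowledge downstream.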
format Online
Article
Text
id pubmed-8105943
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-8105943 2021-05-10 Combining weakly and strongly supervised learning improves strong supervision in Gleason pattern classification Otálora, Sebastian; Marini, Niccolò; Müller, Henning; Atzori, Manfredo. BMC Med Imaging (Research). BioMed Central 2021-05-08 /pmc/articles/PMC8105943/ /pubmed/33964886 http://dx.doi.org/10.1186/s12880-021-00609-0 Text en © The Author(s) 2021. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/); the Creative Commons Public Domain Dedication waiver (https://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
title Combining weakly and strongly supervised learning improves strong supervision in Gleason pattern classification
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8105943/
https://www.ncbi.nlm.nih.gov/pubmed/33964886
http://dx.doi.org/10.1186/s12880-021-00609-0