Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models
Main Authors: Alam, Minhaj Nur; Yamashita, Rikiya; Ramesh, Vignav; Prabhune, Tejas; Lim, Jennifer I.; Chan, R. V. P.; Hallak, Joelle; Leng, Theodore; Rubin, Daniel
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10102012/ https://www.ncbi.nlm.nih.gov/pubmed/37055475 http://dx.doi.org/10.1038/s41598-023-33365-y
_version_ | 1785025611254726656 |
author | Alam, Minhaj Nur; Yamashita, Rikiya; Ramesh, Vignav; Prabhune, Tejas; Lim, Jennifer I.; Chan, R. V. P.; Hallak, Joelle; Leng, Theodore; Rubin, Daniel |
author_facet | Alam, Minhaj Nur; Yamashita, Rikiya; Ramesh, Vignav; Prabhune, Tejas; Lim, Jennifer I.; Chan, R. V. P.; Hallak, Joelle; Leng, Theodore; Rubin, Daniel |
author_sort | Alam, Minhaj Nur |
collection | PubMed |
description | Diabetic retinopathy (DR) is a major cause of vision impairment in diabetic patients worldwide. Due to its prevalence, early clinical diagnosis is essential to improve treatment management of DR patients. Despite recent demonstrations of successful machine learning (ML) models for automated DR detection, there is a significant clinical need for robust models that can be trained with smaller cohorts of data and still perform with high diagnostic accuracy on independent clinical datasets (i.e., high model generalizability). Towards this need, we have developed a self-supervised contrastive learning (CL) based pipeline for classification of referable vs non-referable DR. Self-supervised CL-based pretraining enables enhanced data representation and, therefore, the development of robust and generalized deep learning (DL) models, even with small, labeled datasets. We have integrated a neural style transfer (NST) augmentation into the CL pipeline to produce models with better representations and initializations for the detection of DR in color fundus images. We compare our CL pretrained model performance with two state-of-the-art baseline models pretrained with ImageNet weights. We further investigate the model performance with reduced labeled training data (down to 10 percent) to test the robustness of the model when trained with small, labeled datasets. The model is trained and validated on the EyePACS dataset and tested independently on clinical datasets from the University of Illinois, Chicago (UIC). Compared to baseline models, our CL pretrained FundusNet model had a higher area under the receiver operating characteristic (ROC) curve (AUC) with confidence intervals (CI) (0.91 (0.898 to 0.930) vs 0.80 (0.783 to 0.820) and 0.83 (0.801 to 0.853) on UIC data). At 10 percent labeled training data, the FundusNet AUC was 0.81 (0.78 to 0.84) vs 0.58 (0.56 to 0.64) and 0.63 (0.60 to 0.66) in the baseline models when tested on the UIC dataset.
CL-based pretraining with NST significantly improves DL classification performance, helps the model generalize well (transferable from EyePACS to UIC data), and allows training with small, annotated datasets, thereby reducing the ground truth annotation burden on clinicians. |
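The abstract describes self-supervised contrastive pretraining on paired augmented views of the same fundus image (including neural-style-transferred variants). A minimal NumPy sketch of a SimCLR-style NT-Xent contrastive loss illustrates the idea; the paper does not publish its exact loss formulation here, so the function name, temperature value, and batch layout are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style NT-Xent loss over two augmented views.

    z1, z2: (N, D) arrays of embeddings for the two views of N images
    (e.g. a fundus image and a style-transferred counterpart).
    Each row of z1 is a "positive" for the matching row of z2; all
    other rows in the batch act as negatives.
    """
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)                # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)    # unit-normalize -> cosine sim
    sim = z @ z.T / temperature                         # (2N, 2N) similarity matrix
    np.fill_diagonal(sim, -np.inf)                      # exclude self-comparisons
    # index of each row's positive partner: i <-> i + n
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # cross-entropy of the positive against all other batch entries
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

Pulling the two views of the same image together (and pushing apart the rest of the batch) is what yields the transferable representation the abstract credits for good performance with only 10 percent of the labels.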
format | Online Article Text |
id | pubmed-10102012 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-101020122023-04-15 Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models Alam, Minhaj Nur; Yamashita, Rikiya; Ramesh, Vignav; Prabhune, Tejas; Lim, Jennifer I.; Chan, R. V. P.; Hallak, Joelle; Leng, Theodore; Rubin, Daniel Sci Rep Article Nature Publishing Group UK 2023-04-13 /pmc/articles/PMC10102012/ /pubmed/37055475 http://dx.doi.org/10.1038/s41598-023-33365-y Text en © The Author(s) 2023. Open Access under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). |
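The record reports model quality as area under the ROC curve (AUC) with confidence intervals (e.g. FundusNet 0.91 vs. 0.80 and 0.83 on UIC data). AUC equals the probability that a randomly chosen positive (here, referable-DR) image is scored above a randomly chosen negative, which a small sketch can compute directly via the Mann-Whitney U statistic; this is an illustration of the metric, not the authors' evaluation code:

```python
import numpy as np

def roc_auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive scores higher than a randomly chosen
    negative, with ties counted as one half."""
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    pos, neg = scores[labels], scores[~labels]
    # all positive-vs-negative pairwise comparisons
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```

The confidence intervals quoted alongside each AUC (e.g. 0.91 (0.898 to 0.930)) are typically obtained by resampling the test set, for instance with a bootstrap over patients or images.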
spellingShingle | Article Alam, Minhaj Nur Yamashita, Rikiya Ramesh, Vignav Prabhune, Tejas Lim, Jennifer I. Chan, R. V. P. Hallak, Joelle Leng, Theodore Rubin, Daniel Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models |
title | Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models |
title_full | Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models |
title_fullStr | Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models |
title_full_unstemmed | Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models |
title_short | Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models |
title_sort | contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10102012/ https://www.ncbi.nlm.nih.gov/pubmed/37055475 http://dx.doi.org/10.1038/s41598-023-33365-y |
work_keys_str_mv | AT alamminhajnur contrastivelearningbasedpretrainingimprovesrepresentationandtransferabilityofdiabeticretinopathyclassificationmodels AT yamashitarikiya contrastivelearningbasedpretrainingimprovesrepresentationandtransferabilityofdiabeticretinopathyclassificationmodels AT rameshvignav contrastivelearningbasedpretrainingimprovesrepresentationandtransferabilityofdiabeticretinopathyclassificationmodels AT prabhunetejas contrastivelearningbasedpretrainingimprovesrepresentationandtransferabilityofdiabeticretinopathyclassificationmodels AT limjenniferi contrastivelearningbasedpretrainingimprovesrepresentationandtransferabilityofdiabeticretinopathyclassificationmodels AT chanrvp contrastivelearningbasedpretrainingimprovesrepresentationandtransferabilityofdiabeticretinopathyclassificationmodels AT hallakjoelle contrastivelearningbasedpretrainingimprovesrepresentationandtransferabilityofdiabeticretinopathyclassificationmodels AT lengtheodore contrastivelearningbasedpretrainingimprovesrepresentationandtransferabilityofdiabeticretinopathyclassificationmodels AT rubindaniel contrastivelearningbasedpretrainingimprovesrepresentationandtransferabilityofdiabeticretinopathyclassificationmodels |