
Distributed deep learning networks among institutions for medical imaging


Bibliographic Details
Main Authors: Chang, Ken, Balachandar, Niranjan, Lam, Carson, Yi, Darvin, Brown, James, Beers, Andrew, Rosen, Bruce, Rubin, Daniel L, Kalpathy-Cramer, Jayashree
Format: Online Article Text
Language: English
Published: Oxford University Press 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6077811/
https://www.ncbi.nlm.nih.gov/pubmed/29617797
http://dx.doi.org/10.1093/jamia/ocy017
author Chang, Ken
Balachandar, Niranjan
Lam, Carson
Yi, Darvin
Brown, James
Beers, Andrew
Rosen, Bruce
Rubin, Daniel L
Kalpathy-Cramer, Jayashree
collection PubMed
description OBJECTIVE: Deep learning has become a promising approach for automated support for clinical diagnosis. When medical data samples are limited, collaboration among multiple institutions is necessary to achieve high algorithm performance. However, sharing patient data often has limitations due to technical, legal, or ethical concerns. In this study, we propose methods of distributing deep learning models as an attractive alternative to sharing patient data. METHODS: We simulate the distribution of deep learning models across 4 institutions using various training heuristics and compare the results with a deep learning model trained on centrally hosted patient data. The training heuristics investigated include ensembling single institution models, single weight transfer, and cyclical weight transfer. We evaluated these approaches for image classification in 3 independent image collections (retinal fundus photos, mammography, and ImageNet). RESULTS: We found that cyclical weight transfer resulted in performance comparable to that of a model trained on centrally hosted patient data. We also found that the performance of the cyclical weight transfer heuristic improved with a higher frequency of weight transfer. CONCLUSIONS: We show that distributing deep learning models is an effective alternative to sharing patient data. This finding has implications for any collaborative deep learning study.
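The cyclical weight transfer heuristic described above can be illustrated with a toy simulation: model weights visit each institution in turn and train on that site's local data, so raw patient data never leaves its home institution. Everything in this sketch (the logistic model, the synthetic two-feature data, the hyperparameters, the four simulated sites) is an illustrative assumption, not the authors' actual implementation.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def make_institution_data(n, shift):
    # Each simulated institution draws from a slightly shifted distribution,
    # mimicking mild inter-site variation; labels follow a shared linear rule.
    data = []
    for _ in range(n):
        x1 = random.uniform(-1, 1) + shift
        x2 = random.uniform(-1, 1)
        y = 1 if x1 + x2 > shift else 0
        data.append(((x1, x2), y))
    return data

def train_locally(w, b, data, epochs=1, lr=0.5):
    # Plain SGD on one institution's private data; only (w, b) leave the site.
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

# Four institutions, as in the simulation described in the abstract.
institutions = [make_institution_data(200, s) for s in (0.0, 0.1, -0.1, 0.05)]

# Cyclical weight transfer: the weights cycle through every institution,
# making many short visits (high transfer frequency) rather than one long one.
w, b = [0.0, 0.0], 0.0
for cycle in range(10):
    for site_data in institutions:
        w, b = train_locally(w, b, site_data, epochs=1)

# Evaluate on held-out pooled data (permissible here because it is simulated).
test = make_institution_data(500, 0.0)
correct = sum(
    1 for (x1, x2), y in test
    if (sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5) == (y == 1)
)
accuracy = correct / len(test)
print(f"cyclical-transfer accuracy: {accuracy:.2f}")
```

Raising the transfer frequency (more cycles with fewer local epochs each) is the knob the abstract reports as improving this heuristic's performance.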
format Online
Article
Text
id pubmed-6077811
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher Oxford University Press
record_format MEDLINE/PubMed
spelling pubmed-6077811 2018-08-09 Distributed deep learning networks among institutions for medical imaging J Am Med Inform Assoc Research and Applications Oxford University Press 2018-03-29 /pmc/articles/PMC6077811/ /pubmed/29617797 http://dx.doi.org/10.1093/jamia/ocy017 Text en © The Author(s) 2018. Published by Oxford University Press on behalf of the American Medical Informatics Association.
http://creativecommons.org/licenses/by-nc/4.0/ This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited. For commercial re-use, please contact journals.permissions@oup.com
title Distributed deep learning networks among institutions for medical imaging
topic Research and Applications