Medical imaging deep learning with differential privacy
The successful training of deep learning models for diagnostic deployment in medical imaging applications requires large volumes of data. Such data cannot be procured without consideration for patient privacy, mandated both by legal regulations and ethical requirements of the medical profession. Differential privacy (DP) enables the provision of information-theoretic privacy guarantees to patients and can be implemented in the setting of deep neural network training through the differentially private stochastic gradient descent (DP-SGD) algorithm. We here present deepee, a free-and-open-source framework for differentially private deep learning for use with the PyTorch deep learning framework.
Main authors: | Ziller, Alexander; Usynin, Dmitrii; Braren, Rickmer; Makowski, Marcus; Rueckert, Daniel; Kaissis, Georgios |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2021 |
Subjects: | Article |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8242021/ https://www.ncbi.nlm.nih.gov/pubmed/34188157 http://dx.doi.org/10.1038/s41598-021-93030-0 |
_version_ | 1783715538402279424 |
---|---|
author | Ziller, Alexander Usynin, Dmitrii Braren, Rickmer Makowski, Marcus Rueckert, Daniel Kaissis, Georgios |
author_facet | Ziller, Alexander Usynin, Dmitrii Braren, Rickmer Makowski, Marcus Rueckert, Daniel Kaissis, Georgios |
author_sort | Ziller, Alexander |
collection | PubMed |
description | The successful training of deep learning models for diagnostic deployment in medical imaging applications requires large volumes of data. Such data cannot be procured without consideration for patient privacy, mandated both by legal regulations and ethical requirements of the medical profession. Differential privacy (DP) enables the provision of information-theoretic privacy guarantees to patients and can be implemented in the setting of deep neural network training through the differentially private stochastic gradient descent (DP-SGD) algorithm. We here present deepee, a free-and-open-source framework for differentially private deep learning for use with the PyTorch deep learning framework. Our framework is based on parallelised execution of neural network operations to obtain and modify the per-sample gradients. The process is efficiently abstracted via a data structure maintaining shared memory references to neural network weights to maintain memory efficiency. We furthermore offer specialised data loading procedures and privacy budget accounting based on the Gaussian Differential Privacy framework, as well as automated modification of the user-supplied neural network architectures to ensure DP-conformity of its layers. We benchmark our framework’s computational performance against other open-source DP frameworks and evaluate its application on the paediatric pneumonia dataset, an image classification task and on the Medical Segmentation Decathlon Liver dataset in the task of medical image segmentation. We find that neural network training with rigorous privacy guarantees is possible while maintaining acceptable classification performance and excellent segmentation performance. Our framework compares favourably to related work with respect to memory consumption and computational performance. Our work presents an open-source software framework for differentially private deep learning, which we demonstrate in medical imaging analysis tasks. 
It serves to further the utilisation of privacy-enhancing techniques in medicine and beyond in order to assist researchers and practitioners in addressing the numerous outstanding challenges towards their widespread implementation. |
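The DP-SGD mechanism summarised in the description (obtain per-sample gradients, clip each to a fixed L2 bound, add calibrated Gaussian noise) can be sketched in a framework-agnostic way. The following is a minimal NumPy illustration, not deepee's actual API; the function name `dp_sgd_step` and its parameters are hypothetical:

```python
import numpy as np

def dp_sgd_step(per_sample_grads, clip_norm, noise_multiplier, rng):
    """One DP-SGD update on a batch of per-sample gradients.

    per_sample_grads: array of shape (batch, dim), one gradient per sample.
    clip_norm: maximum L2 norm C each per-sample gradient may contribute.
    noise_multiplier: Gaussian noise standard deviation as a multiple of C.
    """
    # 1. Clip each sample's gradient to L2 norm <= clip_norm.
    norms = np.linalg.norm(per_sample_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_sample_grads * scale

    # 2. Sum the clipped gradients and add Gaussian noise calibrated
    #    to the clipping bound (sensitivity of the sum is clip_norm).
    summed = clipped.sum(axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=summed.shape)

    # 3. Average over the batch to obtain the private gradient estimate.
    return (summed + noise) / len(per_sample_grads)

rng = np.random.default_rng(0)
grads = rng.normal(size=(32, 10))  # toy batch of 32 per-sample gradients
g = dp_sgd_step(grads, clip_norm=1.0, noise_multiplier=1.1, rng=rng)
```

In practice the per-sample gradients come from the training framework (the paper's approach parallelises network operations to obtain them), and the privacy cost of each such step is tracked by an accountant such as the Gaussian Differential Privacy framework mentioned above.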
format | Online Article Text |
id | pubmed-8242021 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-82420212021-07-06 Medical imaging deep learning with differential privacy Ziller, Alexander Usynin, Dmitrii Braren, Rickmer Makowski, Marcus Rueckert, Daniel Kaissis, Georgios Sci Rep Article The successful training of deep learning models for diagnostic deployment in medical imaging applications requires large volumes of data. Such data cannot be procured without consideration for patient privacy, mandated both by legal regulations and ethical requirements of the medical profession. Differential privacy (DP) enables the provision of information-theoretic privacy guarantees to patients and can be implemented in the setting of deep neural network training through the differentially private stochastic gradient descent (DP-SGD) algorithm. We here present deepee, a free-and-open-source framework for differentially private deep learning for use with the PyTorch deep learning framework. Our framework is based on parallelised execution of neural network operations to obtain and modify the per-sample gradients. The process is efficiently abstracted via a data structure maintaining shared memory references to neural network weights to maintain memory efficiency. We furthermore offer specialised data loading procedures and privacy budget accounting based on the Gaussian Differential Privacy framework, as well as automated modification of the user-supplied neural network architectures to ensure DP-conformity of its layers. We benchmark our framework’s computational performance against other open-source DP frameworks and evaluate its application on the paediatric pneumonia dataset, an image classification task and on the Medical Segmentation Decathlon Liver dataset in the task of medical image segmentation. We find that neural network training with rigorous privacy guarantees is possible while maintaining acceptable classification performance and excellent segmentation performance. 
Our framework compares favourably to related work with respect to memory consumption and computational performance. Our work presents an open-source software framework for differentially private deep learning, which we demonstrate in medical imaging analysis tasks. It serves to further the utilisation of privacy-enhancing techniques in medicine and beyond in order to assist researchers and practitioners in addressing the numerous outstanding challenges towards their widespread implementation. Nature Publishing Group UK 2021-06-29 /pmc/articles/PMC8242021/ /pubmed/34188157 http://dx.doi.org/10.1038/s41598-021-93030-0 Text en © The Author(s) 2021 https://creativecommons.org/licenses/by/4.0/ Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Ziller, Alexander Usynin, Dmitrii Braren, Rickmer Makowski, Marcus Rueckert, Daniel Kaissis, Georgios Medical imaging deep learning with differential privacy |
title | Medical imaging deep learning with differential privacy |
title_full | Medical imaging deep learning with differential privacy |
title_fullStr | Medical imaging deep learning with differential privacy |
title_full_unstemmed | Medical imaging deep learning with differential privacy |
title_short | Medical imaging deep learning with differential privacy |
title_sort | medical imaging deep learning with differential privacy |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8242021/ https://www.ncbi.nlm.nih.gov/pubmed/34188157 http://dx.doi.org/10.1038/s41598-021-93030-0 |
work_keys_str_mv | AT zilleralexander medicalimagingdeeplearningwithdifferentialprivacy AT usynindmitrii medicalimagingdeeplearningwithdifferentialprivacy AT brarenrickmer medicalimagingdeeplearningwithdifferentialprivacy AT makowskimarcus medicalimagingdeeplearningwithdifferentialprivacy AT rueckertdaniel medicalimagingdeeplearningwithdifferentialprivacy AT kaissisgeorgios medicalimagingdeeplearningwithdifferentialprivacy |