
Privacy-preserving federated neural network learning for disease-associated cell classification

Training accurate and robust machine learning models requires a large amount of data that is usually scattered across data silos. Sharing or centralizing the data of different healthcare institutions is, however, unfeasible or prohibitively difficult due to privacy regulations. In this work, we address this problem by using a privacy-preserving federated learning-based approach, PriCell, for complex models such as convolutional neural networks. PriCell relies on multiparty homomorphic encryption and enables the collaborative training of encrypted neural networks with multiple healthcare institutions. We preserve the confidentiality of each institution's input data, of any intermediate values, and of the trained model parameters. We efficiently replicate the training of a published state-of-the-art convolutional neural network architecture in a decentralized and privacy-preserving manner. Our solution achieves accuracy comparable with that of the centralized, non-secure solution. PriCell guarantees patient privacy and ensures data utility for efficient multi-center studies involving complex healthcare data.
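The abstract is summary-level. As rough orientation only, the minimal sketch below shows the kind of workflow it describes: each site computes a model update on its own data, and only a protected, aggregated update ever reaches the shared model. The local_gradient, protect, and aggregate helpers and the toy logistic-regression model are hypothetical stand-ins for illustration; they are not PriCell's multiparty homomorphic-encryption pipeline or the paper's convolutional network.

# Illustrative sketch only: federated training in which raw patient data never
# leaves its site. The protection/aggregation steps are placeholders, not the
# paper's multiparty homomorphic encryption.
import numpy as np

def local_gradient(weights, X, y):
    # One logistic-regression gradient step on a site's private data (toy model).
    preds = 1.0 / (1.0 + np.exp(-X @ weights))
    return X.T @ (preds - y) / len(y)

def protect(update):
    # Placeholder for encrypting the local update before it leaves the site.
    return update.copy()

def aggregate(protected_updates):
    # Placeholder for secure aggregation over the protected updates.
    return np.mean(protected_updates, axis=0)

rng = np.random.default_rng(0)
n_sites, n_features, lr = 3, 8, 0.5
sites = [(rng.normal(size=(100, n_features)), rng.integers(0, 2, size=100))
         for _ in range(n_sites)]
weights = np.zeros(n_features)

for _ in range(50):                      # global training rounds
    updates = [protect(local_gradient(weights, X, y)) for X, y in sites]
    weights -= lr * aggregate(updates)   # only aggregated updates touch the model

In PriCell itself, the protection step is multiparty homomorphic encryption and training operates directly on encrypted neural-network parameters; the sketch only mirrors the data-stays-local structure described in the abstract.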


Bibliographic Details
Main Authors: Sav, Sinem, Bossuat, Jean-Philippe, Troncoso-Pastoriza, Juan R., Claassen, Manfred, Hubaux, Jean-Pierre
Format: Online Article Text
Language: English
Published: Elsevier 2022
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9122966/
https://www.ncbi.nlm.nih.gov/pubmed/35607628
http://dx.doi.org/10.1016/j.patter.2022.100487
collection PubMed
id pubmed-9122966
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal Patterns (N Y)
published 2022-04-18
rights © 2022 The Authors. Open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).