Evaluation of Open-Source Tools for Differential Privacy
Differential privacy (DP) defines privacy protection by promising quantified indistinguishability between individuals who consent to share their privacy-sensitive information and those who do not. DP aims to deliver this promise by including well-crafted elements of random noise in the published data […]
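The noise-versus-utility tradeoff described in this abstract can be made concrete with a short sketch. The snippet below is not taken from the paper or its evaluation framework; it is a minimal illustration of the standard Laplace mechanism for a mean query, where the noise scale is sensitivity/ε, so a smaller ε (stronger privacy) yields a noisier, less accurate answer. The toy dataset, bounds, and ε values are illustrative choices, not data from the study.

```python
# Minimal sketch of the Laplace mechanism (not the paper's evaluation framework).
# Noise scale = sensitivity / epsilon, so lower epsilon => stronger privacy, larger error.
import numpy as np

rng = np.random.default_rng(0)
ages = rng.integers(18, 90, size=1_000)  # toy dataset of a bounded numeric attribute

def dp_mean(values, epsilon, lower, upper):
    """Differentially private mean of `values` via the Laplace mechanism."""
    clipped = np.clip(values, lower, upper)        # bound each record's influence
    sensitivity = (upper - lower) / len(clipped)   # max change in the mean from altering one record
    noise = rng.laplace(scale=sensitivity / epsilon)
    return clipped.mean() + noise

for eps in (0.01, 0.1, 1.0, 10.0):
    err = abs(dp_mean(ages, eps, 18, 90) - ages.mean())
    print(f"epsilon={eps:>5}: absolute error {err:.3f}")
```

The libraries evaluated in the paper wrap mechanisms like this behind higher-level APIs; for instance, Diffprivlib (one of the evaluated tools) offers a differentially private mean through `diffprivlib.tools.mean(ages, epsilon=eps, bounds=(18, 90))` for the same kind of query.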
Main Authors: | Zhang, Shiliang; Hagermalm, Anton; Slavnic, Sanjin; Schiller, Elad Michael; Almgren, Magnus |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10386022/ https://www.ncbi.nlm.nih.gov/pubmed/37514803 http://dx.doi.org/10.3390/s23146509 |
_version_ | 1785081557540667392 |
---|---|
author | Zhang, Shiliang; Hagermalm, Anton; Slavnic, Sanjin; Schiller, Elad Michael; Almgren, Magnus
author_facet | Zhang, Shiliang; Hagermalm, Anton; Slavnic, Sanjin; Schiller, Elad Michael; Almgren, Magnus
author_sort | Zhang, Shiliang |
collection | PubMed |
description | Differential privacy (DP) defines privacy protection by promising quantified indistinguishability between individuals who consent to share their privacy-sensitive information and those who do not. DP aims to deliver this promise by including well-crafted elements of random noise in the published data, and thus there is an inherent tradeoff between the degree of privacy protection and the ability to utilize the protected data. Currently, several open-source tools have been proposed for DP provision. To the best of our knowledge, there is no comprehensive study for comparing these open-source tools with respect to their ability to balance DP’s inherent tradeoff as well as the use of system resources. This work proposes an open-source evaluation framework for privacy protection solutions and offers evaluation for OpenDP Smartnoise, Google DP, PyTorch Opacus, Tensorflow Privacy, and Diffprivlib. In addition to studying their ability to balance the above tradeoff, we consider discrete and continuous attributes by quantifying their performance under different data sizes. Our results reveal several patterns that developers should have in mind when selecting tools under different application needs and criteria. This evaluation survey can be the basis for an improved selection of open-source DP tools and quicker adaptation of DP. |
format | Online Article Text |
id | pubmed-10386022 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-10386022 2023-07-30 Evaluation of Open-Source Tools for Differential Privacy Zhang, Shiliang Hagermalm, Anton Slavnic, Sanjin Schiller, Elad Michael Almgren, Magnus Sensors (Basel) Article Differential privacy (DP) defines privacy protection by promising quantified indistinguishability between individuals who consent to share their privacy-sensitive information and those who do not. DP aims to deliver this promise by including well-crafted elements of random noise in the published data, and thus there is an inherent tradeoff between the degree of privacy protection and the ability to utilize the protected data. Currently, several open-source tools have been proposed for DP provision. To the best of our knowledge, there is no comprehensive study for comparing these open-source tools with respect to their ability to balance DP’s inherent tradeoff as well as the use of system resources. This work proposes an open-source evaluation framework for privacy protection solutions and offers evaluation for OpenDP Smartnoise, Google DP, PyTorch Opacus, Tensorflow Privacy, and Diffprivlib. In addition to studying their ability to balance the above tradeoff, we consider discrete and continuous attributes by quantifying their performance under different data sizes. Our results reveal several patterns that developers should have in mind when selecting tools under different application needs and criteria. This evaluation survey can be the basis for an improved selection of open-source DP tools and quicker adaptation of DP. MDPI 2023-07-19 /pmc/articles/PMC10386022/ /pubmed/37514803 http://dx.doi.org/10.3390/s23146509 Text en © 2023 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Zhang, Shiliang Hagermalm, Anton Slavnic, Sanjin Schiller, Elad Michael Almgren, Magnus Evaluation of Open-Source Tools for Differential Privacy |
title | Evaluation of Open-Source Tools for Differential Privacy |
title_full | Evaluation of Open-Source Tools for Differential Privacy |
title_fullStr | Evaluation of Open-Source Tools for Differential Privacy |
title_full_unstemmed | Evaluation of Open-Source Tools for Differential Privacy |
title_short | Evaluation of Open-Source Tools for Differential Privacy |
title_sort | evaluation of open-source tools for differential privacy |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10386022/ https://www.ncbi.nlm.nih.gov/pubmed/37514803 http://dx.doi.org/10.3390/s23146509 |
work_keys_str_mv | AT zhangshiliang evaluationofopensourcetoolsfordifferentialprivacy AT hagermalmanton evaluationofopensourcetoolsfordifferentialprivacy AT slavnicsanjin evaluationofopensourcetoolsfordifferentialprivacy AT schillereladmichael evaluationofopensourcetoolsfordifferentialprivacy AT almgrenmagnus evaluationofopensourcetoolsfordifferentialprivacy |