Towards formalizing the GDPR’s notion of singling out
There is a significant conceptual gap between legal and mathematical thinking around data privacy. The effect is uncertainty as to which technical offerings meet legal standards. This uncertainty is exacerbated by a litany of successful privacy attacks demonstrating that traditional statistical disclosure limitation techniques often fall short of the privacy envisioned by regulators. We define “predicate singling out,” a type of privacy attack intended to capture the concept of singling out appearing in the General Data Protection Regulation (GDPR). An adversary predicate singles out a dataset x using the output of a data-release mechanism M(x) if it finds a predicate p matching exactly one row in x with probability much better than a statistical baseline. A data-release mechanism that precludes such attacks is “secure against predicate singling out” (PSO secure). We argue that PSO security is a mathematical concept with legal consequences. Any data-release mechanism that purports to “render anonymous” personal data under the GDPR must prevent singling out and, hence, must be PSO secure. We analyze the properties of PSO security, showing that it fails to compose. Namely, a combination of more than logarithmically many exact counts, each individually PSO secure, facilitates predicate singling out. Finally, we ask whether differential privacy and k-anonymity are PSO secure. Leveraging a connection to statistical generalization, we show that differential privacy implies PSO security. However, and in contrast with current legal guidance, k-anonymity does not: There exists a simple predicate singling out attack under mild assumptions on the k-anonymizer and the data distribution.
Main Authors: | Cohen, Aloni; Nissim, Kobbi |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | National Academy of Sciences, 2020 |
Subjects: | Physical Sciences |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7165454/ https://www.ncbi.nlm.nih.gov/pubmed/32234789 http://dx.doi.org/10.1073/pnas.1914598117 |
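The abstract above hinges on one concrete condition: a predicate p “singles out” a dataset if it matches exactly one of its rows, and an attack only counts if it achieves this noticeably more often than a data-independent guess (the statistical baseline). The short Python sketch below is our own illustration, not code from the paper: the dataset, the predicates, and the helper names (isolates, sample_dataset, baseline_rate) are all hypothetical, and the paper’s formal definitions are probabilistic and, among other things, account for the weight (statistical rarity) of the predicate. The sketch only shows the isolation check and the baseline intuition on synthetic data.

```python
# Illustrative toy only; names and data are ours, not the paper's.
import random
from typing import Callable, Dict, List

Row = Dict[str, int]
Predicate = Callable[[Row], bool]

def isolates(predicate: Predicate, rows: List[Row]) -> bool:
    """True iff the predicate matches exactly one row of the dataset."""
    return sum(1 for r in rows if predicate(r)) == 1

def sample_dataset(n: int, rng: random.Random) -> List[Row]:
    """n rows drawn i.i.d. from a toy distribution over (age, zip)."""
    return [{"age": rng.randint(18, 90), "zip": rng.randint(0, 9)} for _ in range(n)]

def baseline_rate(predicate: Predicate, n: int, trials: int, rng: random.Random) -> float:
    """Empirical probability that a fixed, data-independent predicate isolates
    exactly one row of a fresh dataset -- the 'statistical baseline' intuition."""
    hits = sum(isolates(predicate, sample_dataset(n, rng)) for _ in range(trials))
    return hits / trials

rng = random.Random(0)
n = 100

# A "blind" adversary fixes a fairly specific predicate without seeing any output.
blind = lambda r: r["age"] == 23 and r["zip"] == 7

print("baseline isolation rate:", baseline_rate(blind, n, trials=2000, rng=rng))

# By contrast, a deliberately non-private toy mechanism that releases one raw row
# lets the adversary write a predicate matching that row exactly, so it isolates
# far more often than the baseline (it fails only if the dataset happens to
# contain a duplicate of the leaked row).
data = sample_dataset(n, rng)
leaked = data[0]                    # toy "release": one raw record
attack = lambda r: r == leaked
print("attack isolates a row:  ", isolates(attack, data))
```

In this toy run the blind predicate isolates a row only a minority of the time, while the adversary who sees a leaked record succeeds almost always; the paper’s definitions make this kind of gap precise and quantify the baseline as a function of predicate weight.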
_version_ | 1783523477364408320 |
---|---|
author | Cohen, Aloni; Nissim, Kobbi |
author_facet | Cohen, Aloni; Nissim, Kobbi |
author_sort | Cohen, Aloni |
collection | PubMed |
description | There is a significant conceptual gap between legal and mathematical thinking around data privacy. The effect is uncertainty as to which technical offerings meet legal standards. This uncertainty is exacerbated by a litany of successful privacy attacks demonstrating that traditional statistical disclosure limitation techniques often fall short of the privacy envisioned by regulators. We define “predicate singling out,” a type of privacy attack intended to capture the concept of singling out appearing in the General Data Protection Regulation (GDPR). An adversary predicate singles out a dataset x using the output of a data-release mechanism M(x) if it finds a predicate p matching exactly one row in x with probability much better than a statistical baseline. A data-release mechanism that precludes such attacks is “secure against predicate singling out” (PSO secure). We argue that PSO security is a mathematical concept with legal consequences. Any data-release mechanism that purports to “render anonymous” personal data under the GDPR must prevent singling out and, hence, must be PSO secure. We analyze the properties of PSO security, showing that it fails to compose. Namely, a combination of more than logarithmically many exact counts, each individually PSO secure, facilitates predicate singling out. Finally, we ask whether differential privacy and k-anonymity are PSO secure. Leveraging a connection to statistical generalization, we show that differential privacy implies PSO security. However, and in contrast with current legal guidance, k-anonymity does not: There exists a simple predicate singling out attack under mild assumptions on the k-anonymizer and the data distribution. |
format | Online Article Text |
id | pubmed-7165454 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | National Academy of Sciences |
record_format | MEDLINE/PubMed |
spelling | pubmed-7165454 2020-04-23 Towards formalizing the GDPR’s notion of singling out Cohen, Aloni; Nissim, Kobbi Proc Natl Acad Sci U S A Physical Sciences There is a significant conceptual gap between legal and mathematical thinking around data privacy. The effect is uncertainty as to which technical offerings meet legal standards. This uncertainty is exacerbated by a litany of successful privacy attacks demonstrating that traditional statistical disclosure limitation techniques often fall short of the privacy envisioned by regulators. We define “predicate singling out,” a type of privacy attack intended to capture the concept of singling out appearing in the General Data Protection Regulation (GDPR). An adversary predicate singles out a dataset x using the output of a data-release mechanism M(x) if it finds a predicate p matching exactly one row in x with probability much better than a statistical baseline. A data-release mechanism that precludes such attacks is “secure against predicate singling out” (PSO secure). We argue that PSO security is a mathematical concept with legal consequences. Any data-release mechanism that purports to “render anonymous” personal data under the GDPR must prevent singling out and, hence, must be PSO secure. We analyze the properties of PSO security, showing that it fails to compose. Namely, a combination of more than logarithmically many exact counts, each individually PSO secure, facilitates predicate singling out. Finally, we ask whether differential privacy and k-anonymity are PSO secure. Leveraging a connection to statistical generalization, we show that differential privacy implies PSO security. However, and in contrast with current legal guidance, k-anonymity does not: There exists a simple predicate singling out attack under mild assumptions on the k-anonymizer and the data distribution. National Academy of Sciences 2020-04-14 2020-03-31 /pmc/articles/PMC7165454/ /pubmed/32234789 http://dx.doi.org/10.1073/pnas.1914598117 Text en Copyright © 2020 the Author(s). Published by PNAS. This open access article is distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND) (https://creativecommons.org/licenses/by-nc-nd/4.0/). |
spellingShingle | Physical Sciences Cohen, Aloni; Nissim, Kobbi Towards formalizing the GDPR’s notion of singling out |
title | Towards formalizing the GDPR’s notion of singling out |
title_full | Towards formalizing the GDPR’s notion of singling out |
title_fullStr | Towards formalizing the GDPR’s notion of singling out |
title_full_unstemmed | Towards formalizing the GDPR’s notion of singling out |
title_short | Towards formalizing the GDPR’s notion of singling out |
title_sort | towards formalizing the gdpr’s notion of singling out |
topic | Physical Sciences |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7165454/ https://www.ncbi.nlm.nih.gov/pubmed/32234789 http://dx.doi.org/10.1073/pnas.1914598117 |
work_keys_str_mv | AT cohenaloni towardsformalizingthegdprsnotionofsinglingout AT nissimkobbi towardsformalizingthegdprsnotionofsinglingout |