Analysis of Application Examples of Differential Privacy in Deep Learning
Artificial Intelligence is now widely applied, and the privacy leakage problems that follow have drawn growing attention. Attacks such as model inference attacks on deep neural networks can easily extract user information from the networks. Therefore, it is necessary to protect privacy...
Main Authors: | Shen, Zhidong; Zhong, Ting |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Hindawi, 2021 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8564206/ https://www.ncbi.nlm.nih.gov/pubmed/34745246 http://dx.doi.org/10.1155/2021/4244040 |
_version_ | 1784593566185553920 |
---|---|
author | Shen, Zhidong; Zhong, Ting |
author_facet | Shen, Zhidong; Zhong, Ting |
author_sort | Shen, Zhidong |
collection | PubMed |
description | Artificial Intelligence is now widely applied, and the privacy leakage problems that follow have drawn growing attention. Attacks such as model inference attacks on deep neural networks can easily extract user information from the networks. Therefore, it is necessary to protect privacy in deep learning. Differential privacy, a popular topic in privacy preservation in recent years that provides a rigorous privacy guarantee, can also be used to preserve privacy in deep learning. Although many articles have proposed different methods to combine differential privacy and deep learning, there is no comprehensive paper that analyzes and compares the differences and connections between these techniques. For this purpose, this paper compares different differentially private methods in deep learning. We comparatively analyze and classify several deep learning models under differential privacy. We also pay attention to the application of differential privacy in Generative Adversarial Networks (GANs), comparing and analyzing these models. Finally, we summarize the application of differential privacy in deep neural networks. |
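The description above refers to differential privacy's rigorous guarantee: a randomized mechanism M is (ε, δ)-differentially private if, for any two datasets D and D′ that differ in a single record and any set of outputs S, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S] + δ. A common way to bring this guarantee into deep learning is DP-SGD-style training, where per-example gradients are clipped and Gaussian noise is added before the update. The sketch below is only illustrative, assuming a simple logistic-regression loss and hypothetical hyperparameter values; it is not the specific method evaluated in the reviewed paper.

```python
# Illustrative DP-SGD-style step: clip each per-example gradient to L2 norm C,
# then add Gaussian noise with standard deviation noise_multiplier * C before
# averaging. All names and hyperparameters here are hypothetical placeholders.
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """One noisy gradient step for logistic regression on a minibatch (X, y)."""
    if rng is None:
        rng = np.random.default_rng(0)
    per_example_grads = []
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + np.exp(-xi @ w))        # sigmoid prediction
        g = (p - yi) * xi                         # per-example cross-entropy gradient
        norm = np.linalg.norm(g)
        g = g / max(1.0, norm / clip_norm)        # clip gradient to L2 norm <= C
        per_example_grads.append(g)
    grad_sum = np.sum(per_example_grads, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=w.shape)
    noisy_mean = (grad_sum + noise) / len(X)      # noisy average gradient
    return w - lr * noisy_mean

# Usage on toy data: a few noisy steps of training.
rng = np.random.default_rng(42)
X = rng.normal(size=(32, 5))
y = (X[:, 0] > 0).astype(float)
w = np.zeros(5)
for _ in range(10):
    w = dp_sgd_step(w, X, y, rng=rng)
print(w)
```

The overall privacy budget (ε, δ) of such training would be tracked across steps with a composition accountant; that bookkeeping is omitted from this sketch.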
format | Online Article Text |
id | pubmed-8564206 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Hindawi |
record_format | MEDLINE/PubMed |
spelling | pubmed-8564206 2021-11-04 Analysis of Application Examples of Differential Privacy in Deep Learning Shen, Zhidong; Zhong, Ting Comput Intell Neurosci Review Article Hindawi 2021-10-26 /pmc/articles/PMC8564206/ /pubmed/34745246 http://dx.doi.org/10.1155/2021/4244040 Text en Copyright © 2021 Zhidong Shen and Ting Zhong. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. |
spellingShingle | Review Article Shen, Zhidong; Zhong, Ting Analysis of Application Examples of Differential Privacy in Deep Learning |
title | Analysis of Application Examples of Differential Privacy in Deep Learning |
title_full | Analysis of Application Examples of Differential Privacy in Deep Learning |
title_fullStr | Analysis of Application Examples of Differential Privacy in Deep Learning |
title_full_unstemmed | Analysis of Application Examples of Differential Privacy in Deep Learning |
title_short | Analysis of Application Examples of Differential Privacy in Deep Learning |
title_sort | analysis of application examples of differential privacy in deep learning |
topic | Review Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8564206/ https://www.ncbi.nlm.nih.gov/pubmed/34745246 http://dx.doi.org/10.1155/2021/4244040 |
work_keys_str_mv | AT shenzhidong analysisofapplicationexamplesofdifferentialprivacyindeeplearning AT zhongting analysisofapplicationexamplesofdifferentialprivacyindeeplearning |