Knowledge distillation in deep learning and its applications
Deep learning based models are relatively large, and it is hard to deploy such models on resource-limited devices such as mobile phones and embedded devices. One possible solution is knowledge distillation whereby a smaller model (student model) is trained by utilizing the information from a larger model (teacher model). In this paper, we present an outlook of knowledge distillation techniques applied to deep learning models. To compare the performances of different techniques, we propose a new metric called distillation metric which compares different knowledge distillation solutions based on models' sizes and accuracy scores. Based on the survey, some interesting conclusions are drawn and presented in this paper including the current challenges and possible research directions.
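For orientation, the sketch below shows the standard soft-target formulation of knowledge distillation (Hinton, Vinyals & Dean, 2015), which many of the surveyed techniques build on: the student is trained against the teacher's temperature-softened output distribution alongside the ground-truth labels. This is a minimal PyTorch illustration, not code from the paper; the function name and hyperparameter values are illustrative.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Weighted sum of a soft-target term (match the teacher's softened
    distribution) and a hard-target term (match the true labels)."""
    # Soften both distributions with the temperature, compare them with
    # KL divergence, and scale by T^2 so gradient magnitudes stay
    # comparable across temperatures (Hinton et al., 2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example: batch of 8 samples, 10 classes.
s = torch.randn(8, 10)            # student logits
t = torch.randn(8, 10)            # teacher logits (detached in practice)
y = torch.randint(0, 10, (8,))    # ground-truth labels
loss = distillation_loss(s, t, y)
```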
Main Authors: | Alkhulaifi, Abdolmaged; Alsahli, Fahad; Ahmad, Irfan
---|---
Format: | Online Article Text
Language: | English
Published: | PeerJ Inc., 2021
Subjects: | Artificial Intelligence
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8053015/ https://www.ncbi.nlm.nih.gov/pubmed/33954248 http://dx.doi.org/10.7717/peerj-cs.474
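The abstract above also proposes a "distillation metric" that scores a distillation solution by the student's size and accuracy relative to the teacher's. The record does not reproduce the formula, so the snippet below is only a hypothetical reading, assuming a weighted combination of a size ratio and an accuracy ratio; consult the paper for the actual definition.

```python
def distillation_metric(student_size, teacher_size,
                        student_acc, teacher_acc, alpha=0.5):
    # Hypothetical reading of the abstract, NOT the paper's verbatim
    # definition: weight the size ratio (how much the student compresses
    # the teacher) against the accuracy ratio (how much accuracy it
    # retains). Under this convention, lower scores are better.
    size_term = student_size / teacher_size   # < 1 when student is smaller
    acc_term = teacher_acc / student_acc      # ~ 1 when accuracy is retained
    return alpha * size_term + (1.0 - alpha) * acc_term

# Example: a student with 25% of the teacher's parameters that keeps
# 95% of its accuracy (0.76 vs. 0.80 top-1).
score = distillation_metric(25e6, 100e6, 0.76, 0.80)
```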
_version_ | 1783680033638842368 |
---|---
author | Alkhulaifi, Abdolmaged; Alsahli, Fahad; Ahmad, Irfan
author_facet | Alkhulaifi, Abdolmaged; Alsahli, Fahad; Ahmad, Irfan
author_sort | Alkhulaifi, Abdolmaged |
collection | PubMed |
description | Deep learning based models are relatively large, and it is hard to deploy such models on resource-limited devices such as mobile phones and embedded devices. One possible solution is knowledge distillation whereby a smaller model (student model) is trained by utilizing the information from a larger model (teacher model). In this paper, we present an outlook of knowledge distillation techniques applied to deep learning models. To compare the performances of different techniques, we propose a new metric called distillation metric which compares different knowledge distillation solutions based on models' sizes and accuracy scores. Based on the survey, some interesting conclusions are drawn and presented in this paper including the current challenges and possible research directions. |
format | Online Article Text |
id | pubmed-8053015 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | PeerJ Inc. |
record_format | MEDLINE/PubMed |
spelling | pubmed-8053015 2021-05-04 Knowledge distillation in deep learning and its applications Alkhulaifi, Abdolmaged; Alsahli, Fahad; Ahmad, Irfan PeerJ Comput Sci Artificial Intelligence Deep learning based models are relatively large, and it is hard to deploy such models on resource-limited devices such as mobile phones and embedded devices. One possible solution is knowledge distillation whereby a smaller model (student model) is trained by utilizing the information from a larger model (teacher model). In this paper, we present an outlook of knowledge distillation techniques applied to deep learning models. To compare the performances of different techniques, we propose a new metric called distillation metric which compares different knowledge distillation solutions based on models' sizes and accuracy scores. Based on the survey, some interesting conclusions are drawn and presented in this paper including the current challenges and possible research directions. PeerJ Inc. 2021-04-14 /pmc/articles/PMC8053015/ /pubmed/33954248 http://dx.doi.org/10.7717/peerj-cs.474 Text en © 2021 Alkhulaifi et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ Computer Science) and either DOI or URL of the article must be cited. |
spellingShingle | Artificial Intelligence; Alkhulaifi, Abdolmaged; Alsahli, Fahad; Ahmad, Irfan; Knowledge distillation in deep learning and its applications
title | Knowledge distillation in deep learning and its applications |
title_full | Knowledge distillation in deep learning and its applications |
title_fullStr | Knowledge distillation in deep learning and its applications |
title_full_unstemmed | Knowledge distillation in deep learning and its applications |
title_short | Knowledge distillation in deep learning and its applications |
title_sort | knowledge distillation in deep learning and its applications |
topic | Artificial Intelligence |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8053015/ https://www.ncbi.nlm.nih.gov/pubmed/33954248 http://dx.doi.org/10.7717/peerj-cs.474 |
work_keys_str_mv | AT alkhulaifiabdolmaged knowledgedistillationindeeplearninganditsapplications AT alsahlifahad knowledgedistillationindeeplearninganditsapplications AT ahmadirfan knowledgedistillationindeeplearninganditsapplications |