Knowledge distillation in deep learning and its applications
Deep learning-based models are relatively large, and it is hard to deploy such models on resource-limited devices such as mobile phones and embedded devices. One possible solution is knowledge distillation, whereby a smaller model (student model) is trained by utilizing the information from a larger...
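The abstract describes the core teacher–student setup of knowledge distillation. As a minimal sketch of the classic logit-matching formulation (Hinton et al.), not of the specific methods surveyed in this article, the loss below blends hard-label cross-entropy with a KL-divergence term between temperature-softened teacher and student outputs; `temperature` and `alpha` are illustrative hyperparameters, not values from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with a softened teacher/student KL term."""
    # Soften both output distributions with the temperature T.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # Scale the KL term by T^2 so its gradient magnitude matches the CE term.
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Illustrative usage with random tensors (hypothetical shapes):
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)  # teacher is fixed, no gradient needed
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()  # gradients flow into the student logits only
```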
Main Authors: Alkhulaifi, Abdolmaged; Alsahli, Fahad; Ahmad, Irfan
Format: Online Article (Text)
Language: English
Published: PeerJ Inc., 2021
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8053015/
https://www.ncbi.nlm.nih.gov/pubmed/33954248
http://dx.doi.org/10.7717/peerj-cs.474
Similar Items
- A federated learning framework based on transfer learning and knowledge distillation for targeted advertising
  by: Su, Caiyu, et al.
  Published: (2023)
- Class incremental learning of remote sensing images based on class similarity distillation
  by: Shen, Mingge, et al.
  Published: (2023)
- Deep Learning and its Application for Healthcare Delivery in Low and Middle Income Countries
  by: Williams, Douglas, et al.
  Published: (2021)
- Medical Application of Geometric Deep Learning for the Diagnosis of Glaucoma
  by: Thiéry, Alexandre H., et al.
  Published: (2023)
- A non-negative feedback self-distillation method for salient object detection
  by: Chen, Lei, et al.
  Published: (2023)