Explaining Neural Networks Using Attentive Knowledge Distillation
Explaining the predictions of deep neural networks makes the networks more understandable and trusted, leading to their use in various mission-critical tasks. Recent progress in the learning capability of networks has primarily been due to the enormous number of model parameters, so that it is usually…
Main Authors: Lee, Hyeonseok; Kim, Sungchan
Format: Online Article Text
Language: English
Published: MDPI, 2021
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7916876/
https://www.ncbi.nlm.nih.gov/pubmed/33670125
http://dx.doi.org/10.3390/s21041280
Similar Items
- Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms
  by: Li, Linfeng, et al.
  Published: (2023)
- Attention and feature transfer based knowledge distillation
  by: Yang, Guoliang, et al.
  Published: (2023)
- Attention Network with Information Distillation for Super-Resolution
  by: Zang, Huaijuan, et al.
  Published: (2022)
- Knowledge distillation with ensembles of convolutional neural networks for medical image segmentation
  by: Noothout, Julia M. H., et al.
  Published: (2022)
- Author Correction: Attention and feature transfer based knowledge distillation
  by: Yang, Guoliang, et al.
  Published: (2023)