Attention and feature transfer based knowledge distillation
Existing knowledge distillation (KD) methods are mainly based on features, logits, or attention, where features and logits represent the results of reasoning at different stages of a convolutional neural network, and attention maps symbolize the reasoning process. Because of the continuity of the two...
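To make the attention-based flavor of distillation mentioned in the abstract concrete, the following is a minimal sketch, not the authors' exact formulation: it derives spatial attention maps from intermediate teacher and student feature maps and matches them with an L2 loss. The names `attention_map` and `at_loss`, and the tensor shapes, are illustrative assumptions.

```python
# Minimal sketch of attention-map transfer for knowledge distillation
# (illustrative only; not the method proposed in this article).
import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    """Collapse a feature map (N, C, H, W) into a normalized spatial
    attention map (N, H*W) by summing squared activations over channels."""
    am = feat.pow(2).sum(dim=1).flatten(start_dim=1)  # (N, H*W)
    return F.normalize(am, p=2, dim=1)

def at_loss(teacher_feat: torch.Tensor, student_feat: torch.Tensor) -> torch.Tensor:
    """L2 distance between teacher and student attention maps."""
    return (attention_map(teacher_feat) - attention_map(student_feat)).pow(2).mean()

# Example: match attention at one intermediate stage. Channel counts may
# differ between teacher and student because channels are summed out.
t_feat = torch.randn(8, 256, 14, 14)   # hypothetical teacher feature map
s_feat = torch.randn(8, 128, 14, 14)   # hypothetical student feature map
loss = at_loss(t_feat, s_feat)
```

In practice such a term would be added to the student's task loss (and possibly a logit-distillation term) with a weighting coefficient.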
Main Authors: Yang, Guoliang; Yu, Shuaiying; Sheng, Yangyang; Yang, Hao
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10603170/ https://www.ncbi.nlm.nih.gov/pubmed/37884556 http://dx.doi.org/10.1038/s41598-023-43986-y
Similar Items
- Author Correction: Attention and feature transfer based knowledge distillation
  by: Yang, Guoliang, et al. Published: (2023)
- Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms
  by: Li, Linfeng, et al. Published: (2023)
- Explaining Neural Networks Using Attentive Knowledge Distillation
  by: Lee, Hyeonseok, et al. Published: (2021)
- Knowledge distillation based on multi-layer fusion features
  by: Tan, Shengyuan, et al. Published: (2023)
- A Fine-Grained Bird Classification Method Based on Attention and Decoupled Knowledge Distillation
  by: Wang, Kang, et al. Published: (2023)