Attention and feature transfer based knowledge distillation

Existing knowledge distillation (KD) methods are mainly based on features, logits, or attention, where features and logits represent the results of reasoning at different stages of a convolutional neural network, and attention maps symbolize the reasoning process. Because of the continuity of the two...
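To make the feature/logit/attention distinction in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of the two standard loss families such methods build on: an attention-transfer loss in the style of Zagoruyko & Komodakis (2017) and a soft-logit loss in the style of Hinton et al. (2015). This is an illustrative sketch, not the authors' specific method; all function names (attention_map, at_loss, kd_logit_loss) are invented for this example.

import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    # Collapse a CNN feature map (N, C, H, W) into a spatial attention map
    # by summing squared activations over channels, then L2-normalizing the
    # flattened map so teacher and student stay comparable even when their
    # channel counts differ.
    am = feat.pow(2).sum(dim=1).flatten(start_dim=1)  # (N, H*W)
    return F.normalize(am, p=2, dim=1)

def at_loss(student_feat: torch.Tensor, teacher_feat: torch.Tensor) -> torch.Tensor:
    # L2 distance between normalized attention maps: distills the
    # "reasoning process" (assumes matching spatial sizes).
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()

def kd_logit_loss(student_logits: torch.Tensor,
                  teacher_logits: torch.Tensor,
                  T: float = 4.0) -> torch.Tensor:
    # Temperature-softened KL divergence on the logits: distills the
    # "results of reasoning" at the network's output.
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

# Example: teacher and student share spatial size but differ in width.
s_feat = torch.randn(8, 64, 16, 16)
t_feat = torch.randn(8, 256, 16, 16)
loss = at_loss(s_feat, t_feat) + kd_logit_loss(torch.randn(8, 10), torch.randn(8, 10))

Note the design point the abstract hints at: the attention loss constrains intermediate layers (the process), while the logit loss constrains only the final prediction (the result), so the two are commonly combined.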

Bibliographic Details
Main Authors: Yang, Guoliang, Yu, Shuaiying, Sheng, Yangyang, Yang, Hao
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10603170/
https://www.ncbi.nlm.nih.gov/pubmed/37884556
http://dx.doi.org/10.1038/s41598-023-43986-y
