Knowledge distillation based on multi-layer fusion features

Knowledge distillation improves the performance of a small student network by encouraging it to learn knowledge from a pre-trained, high-performance but bulky teacher network. Generally, most current knowledge distillation methods extract relatively simple features from the middle or bottom...
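As background for the abstract above: in the classic knowledge distillation objective (Hinton et al.), the student is trained against the teacher's temperature-softened output distribution in addition to the ground-truth labels. A minimal NumPy sketch of that standard loss follows; the temperature `T`, weight `alpha`, and function names are illustrative assumptions, and this is the baseline formulation, not the multi-layer fusion method this article proposes.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution.
    z = z / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft-target term: KL divergence between the softened teacher
    # and student distributions (the "dark knowledge" signal).
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # Hard-target term: ordinary cross-entropy with ground-truth labels.
    p = softmax(student_logits)
    hard = -np.log(p[np.arange(len(labels)), labels] + 1e-12)
    # T**2 rescales the soft-term gradients, as in the original recipe.
    return np.mean(alpha * (T ** 2) * soft + (1 - alpha) * hard)
```

Feature-based methods such as the one described in this article replace or augment the soft-target term with losses on intermediate-layer features rather than only the final logits.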


Bibliographic Details
Main Authors: Tan, Shengyuan, Guo, Rongzuo, Tang, Jialiang, Jiang, Ning, Zou, Junying
Format: Online Article Text
Language: English
Published: Public Library of Science 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10461825/
https://www.ncbi.nlm.nih.gov/pubmed/37639443
http://dx.doi.org/10.1371/journal.pone.0285901