Knowledge distillation based on multi-layer fusion features
Knowledge distillation improves the performance of a small student network by encouraging it to learn knowledge from a pre-trained, high-performance but bulky teacher network. Generally, most current knowledge distillation methods extract relatively simple features from the middle or bottom...
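The abstract describes the general teacher-student setup. For orientation, below is a minimal sketch of the classic logit-based distillation loss (Hinton et al.), not the multi-layer fusion feature method this paper proposes; the function name `distillation_loss` and the hyperparameter defaults `T` and `alpha` are illustrative choices, while `torch` and `torch.nn.functional` are standard PyTorch.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic logit-based knowledge distillation loss:
    KL divergence between temperature-softened teacher and student
    distributions, blended with the ordinary cross-entropy term."""
    # Soft targets: teacher/student distributions softened at temperature T.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude matches the hard-label term
    # Hard targets: standard cross-entropy against ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

Feature-based methods such as the one in this record extend this idea by also matching intermediate-layer representations between teacher and student, rather than logits alone.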
Main Authors: Tan, Shengyuan; Guo, Rongzuo; Tang, Jialiang; Jiang, Ning; Zou, Junying
Format: Online Article Text
Language: English
Published: Public Library of Science, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10461825/
https://www.ncbi.nlm.nih.gov/pubmed/37639443
http://dx.doi.org/10.1371/journal.pone.0285901
Similar Items
- Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms
  by: Li, Linfeng, et al.
  Published: (2023)
- Knowledge distillation for multi-depth-model-fusion recommendation algorithm
  by: Yang, Mingbao, et al.
  Published: (2022)
- Attention and feature transfer based knowledge distillation
  by: Yang, Guoliang, et al.
  Published: (2023)
- Author Correction: Attention and feature transfer based knowledge distillation
  by: Yang, Guoliang, et al.
  Published: (2023)
- A Novel Knowledge Distillation-Based Feature Selection for the Classification of ADHD
  by: Khan, Naseer Ahmed, et al.
  Published: (2021)