Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms

The success of deep learning has brought breakthroughs in many fields. However, the increased performance of deep learning models is often accompanied by growth in their depth and width, which conflicts with the limited storage, energy, and computational resources of edge devices. Knowledge distillation...
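The abstract is truncated before the paper's multi-scale attention method is described, so only the general setting is visible. As background, standard knowledge distillation (in the style of Hinton et al., 2015, not this paper's specific method) trains a small student network against a large teacher's temperature-softened output distribution. A minimal sketch, with all function names chosen here for illustration:

```python
import math

def softmax(logits, temperature=1.0):
    # Soften the distribution: a higher temperature yields
    # more uniform probabilities that expose "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened logits,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2

# A student whose logits match the teacher's incurs zero loss;
# any mismatch yields a positive penalty.
teacher = [5.0, 1.0, -2.0]
print(distillation_loss(teacher, teacher))              # 0.0
print(distillation_loss([2.0, 2.0, 2.0], teacher) > 0)  # True
```

In practice this distillation term is combined with the usual cross-entropy loss on ground-truth labels; the paper's contribution, per its title, is to improve this transfer with multi-scale attention mechanisms.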

Bibliographic Details
Main Authors: Li, Linfeng; Su, Weixing; Liu, Fang; He, Maowei; Liang, Xiaodan
Format: Online Article Text
Language: English
Published: Springer US, 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9807430/
https://www.ncbi.nlm.nih.gov/pubmed/36619739
http://dx.doi.org/10.1007/s11063-022-11132-w