Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms
The success of deep learning has brought breakthroughs in many fields. However, the increased performance of deep learning models is often accompanied by growth in their depth and width, which conflicts with the limited storage, energy, and computational resources of edge devices. Knowledge dis...
Main Authors: Li, Linfeng; Su, Weixing; Liu, Fang; He, Maowei; Liang, Xiaodan
Format: Online Article, Text
Language: English
Published: Springer US, 2023
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9807430/
https://www.ncbi.nlm.nih.gov/pubmed/36619739
http://dx.doi.org/10.1007/s11063-022-11132-w
Similar Items
- Knowledge distillation based on multi-layer fusion features
  by: Tan, Shengyuan, et al.
  Published: (2023)
- Knowledge distillation for multi-depth-model-fusion recommendation algorithm
  by: Yang, Mingbao, et al.
  Published: (2022)
- Attention and feature transfer based knowledge distillation
  by: Yang, Guoliang, et al.
  Published: (2023)
- Explaining Neural Networks Using Attentive Knowledge Distillation
  by: Lee, Hyeonseok, et al.
  Published: (2021)
- Super-Resolution Network with Information Distillation and Multi-Scale Attention for Medical CT Image
  by: Zhao, Tianliu, et al.
  Published: (2021)