RCKD: Response-Based Cross-Task Knowledge Distillation for Pathological Image Analysis
We propose the Response-based Cross-task Knowledge Distillation (RCKD), a novel transfer learning framework for pathological image analysis that improves model performance by pretraining on a large unlabeled dataset under the guidance of a high-performance teacher model. RCKD first pretrains a s...
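The abstract describes response-based knowledge distillation: a student model is trained to match the teacher's soft output distribution rather than hard labels, which requires no annotations on the pretraining data. A minimal sketch of such a distillation loss is shown below; the temperature value and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) between softened class distributions.

    On unlabeled images, the student minimizes this loss so its responses
    mimic the teacher's; no ground-truth labels are needed.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float(kl.mean())
```

When the student's logits equal the teacher's, the loss is zero; any divergence in the predicted distributions yields a positive penalty that drives the student toward the teacher's responses.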
Main authors: | Kim, Hyunil; Kwak, Tae-Yeong; Chang, Hyeyoon; Kim, Sun Woo; Kim, Injung |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2023 |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10669242/ https://www.ncbi.nlm.nih.gov/pubmed/38002403 http://dx.doi.org/10.3390/bioengineering10111279 |
Similar items
- PET: Parameter-efficient Knowledge Distillation on Transformer
  by: Jeon, Hyojin, et al.
  Published: (2023)
- Explaining Neural Networks Using Attentive Knowledge Distillation
  by: Lee, Hyeonseok, et al.
  Published: (2021)
- Online knowledge distillation network for single image dehazing
  by: Lan, Yunwei, et al.
  Published: (2022)
- Cervical Cell Image Classification-Based Knowledge Distillation
  by: Gao, Wenjian, et al.
  Published: (2022)
- Biocuration: Distilling data into knowledge
  Published: (2018)