Scale-Hybrid Group Distillation with Knowledge Disentangling for Continual Semantic Segmentation
Continual semantic segmentation (CSS) aims to learn new tasks sequentially, extracting objects and stuff of new categories as pixel-level maps while preserving the original segmentation capabilities even when old-class data is absent. Current CSS methods typically preserve the capa...
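As a rough illustration of the distillation idea the abstract alludes to (not the paper's Scale-Hybrid Group Distillation method itself), the sketch below shows a pixel-wise knowledge-distillation term commonly used in CSS to keep the new model's predictions on old classes close to a frozen copy of the old model when old-class data is unavailable. The function name, temperature, and loss weighting are illustrative assumptions.

```python
# Minimal sketch, assuming a standard PyTorch setup; not the paper's method.
import torch
import torch.nn.functional as F

def distillation_loss(new_logits, old_logits, temperature=2.0):
    """Pixel-wise KD term between the new model and a frozen old model.

    new_logits: (B, C_old, H, W) slice of the new model's output for old classes
    old_logits: (B, C_old, H, W) output of the frozen old model
    """
    log_p_new = F.log_softmax(new_logits / temperature, dim=1)
    p_old = F.softmax(old_logits / temperature, dim=1)
    # Scale by T^2 as in standard knowledge distillation.
    return F.kl_div(log_p_new, p_old, reduction="batchmean") * temperature ** 2

if __name__ == "__main__":
    # Hypothetical shapes: batch of 2, 10 old classes, 64x64 feature maps.
    new_out = torch.randn(2, 10, 64, 64)
    old_out = torch.randn(2, 10, 64, 64)
    # Total training loss would typically be: CE on new classes + lambda * KD term.
    print(distillation_loss(new_out, old_out).item())
```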
Main Authors: Song, Zichen; Zhang, Xiaoliang; Shi, Zhaofeng
Format: Online Article Text
Language: English
Published: MDPI, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10537153/
https://www.ncbi.nlm.nih.gov/pubmed/37765877
http://dx.doi.org/10.3390/s23187820
Similar Items
- FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation
  by: Yuan, Wenhao, et al.
  Published: (2023)
- Knowledge Distillation for Semantic Segmentation Using Channel and Spatial Correlations and Adaptive Cross Entropy
  by: Park, Sangyong, et al.
  Published: (2020)
- Frequency Disentanglement Distillation Image Deblurring Network
  by: Liu, Yiming, et al.
  Published: (2021)
- Semantic Segmentation Using Pixel-Wise Adaptive Label Smoothing via Self-Knowledge Distillation for Limited Labeling Data
  by: Park, Sangyong, et al.
  Published: (2022)
- Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms
  by: Li, Linfeng, et al.
  Published: (2023)