Scale-Hybrid Group Distillation with Knowledge Disentangling for Continual Semantic Segmentation

Continual semantic segmentation (CSS) aims to learn new tasks sequentially, extracting objects and stuff of new categories as pixel-level maps while preserving the original segmentation capabilities even when the old class data are absent. Current CSS methods typically preserve the capacity to segment old classes via knowledge distillation, which suffers from two limitations: insufficient use of semantic knowledge, i.e., distilling only the last layer of the feature encoder, and the semantic shift of the background caused by directly distilling the entire feature map of the decoder. In this paper, we propose a novel CSS method based on scale-hybrid distillation and knowledge disentangling to address these limitations. First, we propose a scale-hybrid group semantic distillation (SGD) method for the encoder, which transfers multi-scale knowledge from the old model's feature encoder with group pooling refinement to improve the stability of the new model. Then, a knowledge disentangling distillation (KDD) method for the decoder is proposed, which distills feature maps under the guidance of the old-class regions and reduces incorrect guidance from the old model, yielding better plasticity. Extensive experiments are conducted on the Pascal VOC and ADE20K datasets. Competitive performance compared with other state-of-the-art methods demonstrates the effectiveness of the proposed method.
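The SGD component described in the abstract (multi-scale encoder distillation with group pooling refinement) can be illustrated with a short PyTorch sketch. This is not the authors' code: the function names (group_pool, sgd_loss), the choice of mean-squared error, and the uniform averaging over scales are all assumptions made here for illustration.

```python
import torch
import torch.nn.functional as F

def group_pool(feat: torch.Tensor, groups: int) -> torch.Tensor:
    """Group pooling refinement (assumed form): average the channels of a
    (B, C, H, W) feature map within `groups` channel groups, giving a
    compact (B, groups, H, W) summary to distill."""
    b, c, h, w = feat.shape
    assert c % groups == 0, "C must be divisible by the number of groups"
    return feat.view(b, groups, c // groups, h, w).mean(dim=2)

def sgd_loss(old_feats, new_feats, groups: int = 4) -> torch.Tensor:
    """Scale-hybrid group distillation: match group-pooled features at
    every encoder stage of the frozen old model and the new model."""
    loss = 0.0
    for f_old, f_new in zip(old_feats, new_feats):  # one pair per scale
        loss = loss + F.mse_loss(group_pool(f_new, groups),
                                 group_pool(f_old, groups).detach())
    return loss / len(old_feats)

# Toy usage with two encoder stages (hypothetical shapes).
old_feats = [torch.randn(2, 64, 64, 64), torch.randn(2, 128, 32, 32)]
new_feats = [f + 0.1 * torch.randn_like(f) for f in old_feats]
print(sgd_loss(old_feats, new_feats))
```

Distilling every encoder stage rather than only the last layer is what the abstract refers to as making fuller use of the old model's semantic knowledge.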

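Similarly, the KDD component can be read as a masked decoder distillation: feature maps are distilled only where the frozen old model recognises an old class, so background regions (which may contain future classes) do not propagate incorrect guidance. The sketch below is again an assumption-laden illustration, not the published implementation; kdd_loss, the hard argmax mask, and the pixel normalisation are choices made here.

```python
import torch

def kdd_loss(dec_old: torch.Tensor,
             dec_new: torch.Tensor,
             logits_old: torch.Tensor,
             bg_index: int = 0) -> torch.Tensor:
    """Knowledge-disentangling decoder distillation (illustrative sketch).

    dec_old, dec_new: (B, C, H, W) decoder features of the old / new model.
    logits_old:       (B, K_old, H, W) class logits of the frozen old model.
    Only pixels the old model assigns to an old (non-background) class
    contribute to the distillation term.
    """
    with torch.no_grad():
        pred = logits_old.argmax(dim=1)                 # (B, H, W) hard labels
        mask = (pred != bg_index).float().unsqueeze(1)  # (B, 1, H, W)
    diff = (dec_new - dec_old.detach()).pow(2) * mask
    # Normalise by the number of distilled pixel-channels for a stable scale.
    return diff.sum() / (mask.sum() * dec_old.shape[1] + 1e-6)
```

Restricting the loss to old-class regions is one plausible reading of "distilling with the guidance of the old class regions"; it leaves the new model free to reshape background features, which is where the plasticity gain would come from.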

Bibliographic Details
Main Authors: Song, Zichen; Zhang, Xiaoliang; Shi, Zhaofeng
Format: Online Article Text
Language: English
Journal: Sensors (Basel)
Published: MDPI, 12 September 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10537153/
https://www.ncbi.nlm.nih.gov/pubmed/37765877
http://dx.doi.org/10.3390/s23187820
License: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).