A non-negative feedback self-distillation method for salient object detection
Self-distillation methods utilize Kullback-Leibler divergence (KL) loss to transfer the knowledge from the network itself, which can improve the model performance without increasing computational resources and complexity. However, when applied to salient object detection (SOD), it is difficult to ef...
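For reference, below is a minimal sketch of the Kullback-Leibler distillation loss the abstract refers to, in its generic self-distillation form (this is not the paper's proposed non-negative feedback method). It assumes PyTorch and logit tensors of shape (batch, classes); the branch acting as teacher is detached so gradients only flow to the student branch.

```python
import torch
import torch.nn.functional as F

def self_distillation_kl_loss(student_logits, teacher_logits, temperature=4.0):
    """Generic KL-divergence distillation loss between two outputs of the same network.

    student_logits, teacher_logits: tensors of shape (batch, num_classes).
    The teacher branch is detached so only the student branch is updated.
    """
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=1)
    teacher_probs = F.softmax(teacher_logits.detach() / t, dim=1)
    # Scale by T^2 to keep gradient magnitudes comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * (t * t)

# Example: distill a shallow auxiliary output toward a deeper output of the same model.
shallow = torch.randn(8, 2, requires_grad=True)  # student-role logits
deep = torch.randn(8, 2)                         # teacher-role logits
loss = self_distillation_kl_loss(shallow, deep)
loss.backward()
```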
| Main Authors: | |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | PeerJ Inc., 2023 |
| Subjects: | |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10319267/ https://www.ncbi.nlm.nih.gov/pubmed/37409081 http://dx.doi.org/10.7717/peerj-cs.1435 |