A non-negative feedback self-distillation method for salient object detection
Self-distillation methods utilize Kullback-Leibler divergence (KL) loss to transfer the knowledge from the network itself, which can improve the model performance without increasing computational resources and complexity. However, when applied to salient object detection (SOD), it is difficult to ef...
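To make the mechanism the abstract refers to concrete, below is a minimal sketch of the standard KL-divergence self-distillation loss in PyTorch. This illustrates the generic formulation only, not the paper's proposed non-negative feedback method; the function name, the temperature T, and the use of classification-style logits (rather than per-pixel saliency maps) are assumptions for illustration.

import torch
import torch.nn.functional as F

def self_distillation_kl_loss(student_logits: torch.Tensor,
                              teacher_logits: torch.Tensor,
                              T: float = 4.0) -> torch.Tensor:
    """KL(teacher || student) on temperature-softened distributions.

    In self-distillation both logit tensors come from the same network,
    e.g. a deep-layer "teacher" head guiding a shallow-layer "student" head.
    """
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    # detach() stops gradients from flowing into the teacher branch
    p_teacher = F.softmax(teacher_logits.detach() / T, dim=1)
    # reduction="batchmean" matches the mathematical definition of KL;
    # T**2 rescales gradients to the magnitude of the hard-label loss.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T ** 2)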
Main Authors: Chen, Lei; Cao, Tieyong; Zheng, Yunfei; Yang, Jibin; Wang, Yang; Wang, Yekui; Zhang, Bo
Format: Online Article Text
Language: English
Published: PeerJ Inc., 2023
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10319267/
https://www.ncbi.nlm.nih.gov/pubmed/37409081
http://dx.doi.org/10.7717/peerj-cs.1435
Similar Items
- Knowledge distillation in deep learning and its applications
  by: Alkhulaifi, Abdolmaged, et al.
  Published: (2021)
- A federated learning framework based on transfer learning and knowledge distillation for targeted advertising
  by: Su, Caiyu, et al.
  Published: (2023)
- Class incremental learning of remote sensing images based on class similarity distillation
  by: Shen, Mingge, et al.
  Published: (2023)
- Explainable AI for Data-Driven Feedback and Intelligent Action Recommendations to Support Students Self-Regulation
  by: Afzaal, Muhammad, et al.
  Published: (2021)
- A novel algorithm for small object detection based on YOLOv4
  by: Wei, Jiangshu, et al.
  Published: (2023)