Knowledge distillation for multi-depth-model-fusion recommendation algorithm
Recommendation algorithms save people a great deal of valuable time in finding the information they are interested in. However, the feature calculation and extraction processes of each machine learning or deep learning recommendation algorithm are different, so how to obtain various features with different d...
Main Authors: Yang, Mingbao; Li, Shaobo; Zhou, Peng; Hu, JianJun
Format: Online Article Text
Language: English
Published: Public Library of Science, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9595540/
https://www.ncbi.nlm.nih.gov/pubmed/36282818
http://dx.doi.org/10.1371/journal.pone.0275955
Similar Items

- Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms
  by: Li, Linfeng, et al.
  Published: (2023)
- Knowledge distillation based on multi-layer fusion features
  by: Tan, Shengyuan, et al.
  Published: (2023)
- Knowledge distillation of multi-scale dense prediction transformer for self-supervised depth estimation
  by: Song, Jimin, et al.
  Published: (2023)
- Lightweight Depth Completion Network with Local Similarity-Preserving Knowledge Distillation
  by: Jeong, Yongseop, et al.
  Published: (2022)
- An Adaptive Fusion Algorithm for Depth Completion
  by: Chen, Long, et al.
  Published: (2022)