Knowledge distillation for multi-depth-model-fusion recommendation algorithm
Recommendation algorithms save a lot of valuable time for people to get the information they are interested in. However, the feature calculation and extraction processes of machine learning and deep learning recommendation algorithms differ, so how to obtain features of various dimensions, i.e., how to integrate the advantages of each model while improving inference efficiency, becomes the focus of this paper. In this paper, a stronger deep learning model is obtained by integrating several cutting-edge deep learning models. Meanwhile, to make the integrated model converge better and faster, the parameters of the integration module are initialized, constraints are imposed on them, and a new activation function is designed for better fusion of the sub-models. Finally, knowledge distillation is applied to the integrated large model, which greatly reduces the number of model parameters and improves inference efficiency.
Main Authors: | Yang, Mingbao; Li, Shaobo; Zhou, Peng; Hu, JianJun |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2022 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9595540/ https://www.ncbi.nlm.nih.gov/pubmed/36282818 http://dx.doi.org/10.1371/journal.pone.0275955 |
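The record above describes fusing several cutting-edge deep recommendation models into one large teacher model and then compressing it with knowledge distillation to cut the parameter count and speed up inference. The paper's actual fusion weights, constraints, activation function, and distillation loss are not spelled out in this record, so the snippet below is only a minimal, generic sketch of a temperature-scaled distillation loss in PyTorch; the `Student` network, the `distillation_loss` helper, and the `T` and `alpha` hyperparameters are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Student(nn.Module):
    """Small MLP student whose logits are trained to mimic the fused teacher."""
    def __init__(self, in_dim: int, num_classes: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hinton-style KD loss: soft-target KL term plus hard-label cross-entropy."""
    # Soft targets: KL divergence between temperature-softened distributions,
    # rescaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

if __name__ == "__main__":
    # Toy batch; in practice teacher_logits would come from the fused ensemble.
    batch, in_dim, num_classes = 32, 16, 2
    x = torch.randn(batch, in_dim)
    labels = torch.randint(0, num_classes, (batch,))
    teacher_logits = torch.randn(batch, num_classes)  # stand-in for the teacher
    student = Student(in_dim, num_classes)
    loss = distillation_loss(student(x), teacher_logits.detach(), labels)
    loss.backward()
    print(f"distillation loss: {loss.item():.4f}")
```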
_version_ | 1784815675593719808 |
---|---|
author | Yang, Mingbao; Li, Shaobo; Zhou, Peng; Hu, JianJun |
author_facet | Yang, Mingbao; Li, Shaobo; Zhou, Peng; Hu, JianJun |
author_sort | Yang, Mingbao |
collection | PubMed |
description | Recommendation algorithms save a lot of valuable time for people to get the information they are interested in. However, the feature calculation and extraction processes of machine learning and deep learning recommendation algorithms differ, so how to obtain features of various dimensions, i.e., how to integrate the advantages of each model while improving inference efficiency, becomes the focus of this paper. In this paper, a stronger deep learning model is obtained by integrating several cutting-edge deep learning models. Meanwhile, to make the integrated model converge better and faster, the parameters of the integration module are initialized, constraints are imposed on them, and a new activation function is designed for better fusion of the sub-models. Finally, knowledge distillation is applied to the integrated large model, which greatly reduces the number of model parameters and improves inference efficiency. |
format | Online Article Text |
id | pubmed-9595540 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-9595540 2022-10-26 Knowledge distillation for multi-depth-model-fusion recommendation algorithm Yang, Mingbao; Li, Shaobo; Zhou, Peng; Hu, JianJun PLoS One Research Article Recommendation algorithms save a lot of valuable time for people to get the information they are interested in. However, the feature calculation and extraction processes of machine learning and deep learning recommendation algorithms differ, so how to obtain features of various dimensions, i.e., how to integrate the advantages of each model while improving inference efficiency, becomes the focus of this paper. In this paper, a stronger deep learning model is obtained by integrating several cutting-edge deep learning models. Meanwhile, to make the integrated model converge better and faster, the parameters of the integration module are initialized, constraints are imposed on them, and a new activation function is designed for better fusion of the sub-models. Finally, knowledge distillation is applied to the integrated large model, which greatly reduces the number of model parameters and improves inference efficiency. Public Library of Science 2022-10-25 /pmc/articles/PMC9595540/ /pubmed/36282818 http://dx.doi.org/10.1371/journal.pone.0275955 Text en © 2022 Yang et al https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article; Yang, Mingbao; Li, Shaobo; Zhou, Peng; Hu, JianJun; Knowledge distillation for multi-depth-model-fusion recommendation algorithm |
title | Knowledge distillation for multi-depth-model-fusion recommendation algorithm |
title_full | Knowledge distillation for multi-depth-model-fusion recommendation algorithm |
title_fullStr | Knowledge distillation for multi-depth-model-fusion recommendation algorithm |
title_full_unstemmed | Knowledge distillation for multi-depth-model-fusion recommendation algorithm |
title_short | Knowledge distillation for multi-depth-model-fusion recommendation algorithm |
title_sort | knowledge distillation for multi-depth-model-fusion recommendation algorithm |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9595540/ https://www.ncbi.nlm.nih.gov/pubmed/36282818 http://dx.doi.org/10.1371/journal.pone.0275955 |
work_keys_str_mv | AT yangmingbao knowledgedistillationformultidepthmodelfusionrecommendationalgorithm AT lishaobo knowledgedistillationformultidepthmodelfusionrecommendationalgorithm AT zhoupeng knowledgedistillationformultidepthmodelfusionrecommendationalgorithm AT hujianjun knowledgedistillationformultidepthmodelfusionrecommendationalgorithm |