TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism
Main Authors: | Zhang, Xianrong; Li, Ran; Wang, Simin; Li, Xintong; Sun, Zhe |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Hindawi, 2022 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9444371/ https://www.ncbi.nlm.nih.gov/pubmed/36072720 http://dx.doi.org/10.1155/2022/1775496 |
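The description below notes that DeepFM's factorization machine (FM) component learns not only low-order features but also feature interactions. The core of that claim is the FM second-order term, which scores every feature pair through latent factors yet can be computed in O(kn) time rather than O(kn²). A minimal sketch, assuming NumPy; the variable names are illustrative and this is the textbook FM term, not the authors' exact TAFM implementation:

```python
import numpy as np

def fm_second_order(x, V):
    """Second-order FM interaction: sum_{i<j} <v_i, v_j> * x_i * x_j.

    Uses the standard identity
        0.5 * sum_f [ (sum_i v_{i,f} x_i)^2 - sum_i v_{i,f}^2 x_i^2 ],
    which avoids looping over all feature pairs.

    x: (n,) feature vector; V: (n, k) latent factor matrix.
    """
    xv = x @ V                   # (k,): sum_i v_i * x_i
    x2v2 = (x ** 2) @ (V ** 2)   # (k,): sum_i v_i^2 * x_i^2
    return 0.5 * float(np.sum(xv ** 2 - x2v2))
```

The linear-time reformulation is what makes FM layers practical inside deep models such as DeepFM, where the same latent matrix `V` is shared with the embedding layer of the deep component.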
_version_ | 1784783202245672960 |
---|---|
author | Zhang, Xianrong Li, Ran Wang, Simin Li, Xintong Sun, Zhe |
author_facet | Zhang, Xianrong Li, Ran Wang, Simin Li, Xintong Sun, Zhe |
author_sort | Zhang, Xianrong |
collection | PubMed |
description | The click-through rate (CTR) prediction task is used to estimate the probability that a user clicks on a recommended item, which is extremely important in recommender systems. The recently proposed deep factorization machine (DeepFM) algorithm incorporates a factorization machine (FM) to learn not only low-order features but also higher-order feature interactions. However, DeepFM lacks user diversity representations and does not consider text features. In view of this, we propose a text-attention FM (TAFM) based on the DeepFM algorithm. First, the attention mechanism in the TAFM algorithm is used to address the diverse representations of users and items and to mine the features that are most interesting to users. Second, the TAFM model can fully learn text features through its text component, text attention component, and N-gram text feature extraction component, which can fully explore potential user preferences and the diversity among user interests. In addition, the convolutional autoencoder in the TAFM can learn some higher-level features, making the higher-order feature mining process more comprehensive. On the public dataset, the best-performing existing models are the deep cross network (DCN), DeepFM, and the product-based neural network (PNN), whose AUC scores hover between 0.698 and 0.699. The AUC score of our proposed model is 0.730, at least 3 percentage points higher than that of the existing models. The accuracy metric of our model is at least 0.1 percentage points higher than that of existing models. |
format | Online Article Text |
id | pubmed-9444371 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Hindawi |
record_format | MEDLINE/PubMed |
spelling | pubmed-94443712022-09-06 TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism Zhang, Xianrong Li, Ran Wang, Simin Li, Xintong Sun, Zhe Comput Intell Neurosci Research Article The click-through rate (CTR) prediction task is used to estimate the probability that a user clicks on a recommended item, which is extremely important in recommender systems. The recently proposed deep factorization machine (DeepFM) algorithm incorporates a factorization machine (FM) to learn not only low-order features but also higher-order feature interactions. However, DeepFM lacks user diversity representations and does not consider text features. In view of this, we propose a text-attention FM (TAFM) based on the DeepFM algorithm. First, the attention mechanism in the TAFM algorithm is used to address the diverse representations of users and items and to mine the features that are most interesting to users. Second, the TAFM model can fully learn text features through its text component, text attention component, and N-gram text feature extraction component, which can fully explore potential user preferences and the diversity among user interests. In addition, the convolutional autoencoder in the TAFM can learn some higher-level features, making the higher-order feature mining process more comprehensive. On the public dataset, the best-performing existing models are the deep cross network (DCN), DeepFM, and the product-based neural network (PNN), whose AUC scores hover between 0.698 and 0.699. The AUC score of our proposed model is 0.730, at least 3 percentage points higher than that of the existing models. The accuracy metric of our model is at least 0.1 percentage points higher than that of existing models. Hindawi 2022-08-29 /pmc/articles/PMC9444371/ /pubmed/36072720 http://dx.doi.org/10.1155/2022/1775496 Text en Copyright © 2022 Xianrong Zhang et al.
https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. |
spellingShingle | Research Article Zhang, Xianrong Li, Ran Wang, Simin Li, Xintong Sun, Zhe TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism |
title | TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism |
title_full | TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism |
title_fullStr | TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism |
title_full_unstemmed | TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism |
title_short | TAFM: A Recommendation Algorithm Based on Text-Attention Factorization Mechanism |
title_sort | tafm: a recommendation algorithm based on text-attention factorization mechanism |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9444371/ https://www.ncbi.nlm.nih.gov/pubmed/36072720 http://dx.doi.org/10.1155/2022/1775496 |
work_keys_str_mv | AT zhangxianrong tafmarecommendationalgorithmbasedontextattentionfactorizationmechanism AT liran tafmarecommendationalgorithmbasedontextattentionfactorizationmechanism AT wangsimin tafmarecommendationalgorithmbasedontextattentionfactorizationmechanism AT lixintong tafmarecommendationalgorithmbasedontextattentionfactorizationmechanism AT sunzhe tafmarecommendationalgorithmbasedontextattentionfactorizationmechanism |
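The abstract credits TAFM's attention mechanism with mining "the features that are most interesting to users" from diverse user and item representations. The usual building block for this is attention-weighted pooling over field embeddings: each field gets a learned relevance score, the scores are softmax-normalized, and the embeddings are summed with those weights. A minimal sketch with NumPy; the scoring vector `w` and the single-vector scoring form are illustrative assumptions, not the paper's exact attention layer:

```python
import numpy as np

def attention_pool(E, w):
    """Attention-weighted pooling over feature-field embeddings.

    E: (n, d) matrix of n field embeddings (user, item, text fields, ...).
    w: (d,) learned scoring vector assigning a relevance score per field.
    Returns a single (d,) vector in which fields with higher scores
    contribute more — i.e., the "most interesting" fields dominate.
    """
    scores = E @ w                  # (n,) raw relevance per field
    scores = scores - scores.max()  # shift for numerical stability
    a = np.exp(scores)
    a = a / a.sum()                 # softmax: weights sum to 1
    return a @ E                    # weighted sum of embeddings
```

With `w` set to all zeros the weights degenerate to uniform and the layer reduces to mean pooling, which is exactly the baseline behavior attention improves on: training `w` lets the model move weight toward the fields a given user actually responds to.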