Joint Deep Model with Multi-Level Attention and Hybrid-Prediction for Recommendation
Main Authors:
Format: Online Article Text
Language: English
Published: MDPI, 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514625/ https://www.ncbi.nlm.nih.gov/pubmed/33266859 http://dx.doi.org/10.3390/e21020143
Summary: The Recommender System (RS) has come to play a pivotal role in e-commerce. To improve the performance of RS, review text information has been extensively utilized. However, extracting the most informative features from a tremendous number of reviews remains a challenge for RS. Another significant issue is the modeling of user–item interaction, where existing approaches rarely capture high- and low-order interactions simultaneously. In this paper, we design a multi-level attention mechanism that learns the usefulness of reviews and the significance of words with Deep Neural Networks (DNN). In addition, we develop a hybrid prediction structure that integrates a Factorization Machine (FM) and a DNN to model low-order user–item interactions, as in FM, and to capture high-order interactions, as in DNN. Based on these two designs, we build a Multi-level Attentional and Hybrid-prediction-based Recommender (MAHR) model for recommendation. Extensive experiments on Amazon and Yelp datasets showed that our approach provides more accurate recommendations than state-of-the-art recommendation approaches. Furthermore, the verification experiments and explainability study, including the visualization of the attention modules and a review-usefulness prediction test, also validated the soundness of our multi-level attention mechanism and hybrid prediction.
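For context, the low-order interaction component the summary refers to follows the standard second-order Factorization Machine scoring function (Rendle, 2010); the paper's exact formulation is not reproduced in this record, so the equation below is the textbook form such hybrid designs build on:

```latex
\hat{y}_{\mathrm{FM}}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i
  + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle \, x_i x_j
```

Here $w_0$ is a global bias, $w_i$ are first-order weights, and $\langle \mathbf{v}_i, \mathbf{v}_j \rangle$ is the inner product of latent factor vectors, which models each pairwise (low-order) feature interaction.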
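To make the hybrid-prediction idea concrete, below is a minimal PyTorch-style sketch of combining an FM term (low-order, pairwise interactions) with a DNN term (high-order interactions), in the spirit of DeepFM-like models. All names and hyperparameters (`HybridPredictor`, `embed_dim`, layer sizes) are hypothetical illustrations; the actual MAHR architecture, including its multi-level attention over reviews and words, is described only in the full paper.

```python
# Hypothetical sketch of a DeepFM-style hybrid predictor: an FM term for
# low-order (pairwise) interactions plus a DNN for high-order interactions.
# Names and hyperparameters are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class HybridPredictor(nn.Module):
    def __init__(self, n_features: int, embed_dim: int = 16):
        super().__init__()
        self.linear = nn.Embedding(n_features, 1)         # first-order weights w_i
        self.embed = nn.Embedding(n_features, embed_dim)  # latent factors v_i
        self.bias = nn.Parameter(torch.zeros(1))          # global bias w_0
        # DNN over concatenated field embeddings captures high-order interactions.
        self.dnn = nn.Sequential(
            nn.Linear(2 * embed_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, user_idx, item_idx):
        # Two active fields (user, item); item_idx is assumed to be offset by
        # the number of users so both fields share one feature index space.
        idx = torch.stack([user_idx, item_idx], dim=1)    # (batch, 2)
        v = self.embed(idx)                               # (batch, 2, embed_dim)
        # FM pairwise term via the ((sum)^2 - sum of squares) / 2 identity.
        sum_sq = v.sum(dim=1).pow(2)
        sq_sum = v.pow(2).sum(dim=1)
        fm_pair = 0.5 * (sum_sq - sq_sum).sum(dim=1, keepdim=True)
        fm_out = self.bias + self.linear(idx).sum(dim=1) + fm_pair
        dnn_out = self.dnn(v.flatten(start_dim=1))        # high-order interactions
        return (fm_out + dnn_out).squeeze(-1)             # predicted score
```

The design point the summary makes is visible here: the FM branch and the DNN branch share the same embeddings but contribute separate prediction terms, so low- and high-order interactions are modeled simultaneously rather than by one component alone.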