Session interest model for CTR prediction based on self-attention mechanism


Bibliographic Details
Autores principales: Wang, Qianqian, Liu, Fang’ai, Zhao, Xiaohui, Tan, Qiaoqiao
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2022
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8741903/
https://www.ncbi.nlm.nih.gov/pubmed/34996985
http://dx.doi.org/10.1038/s41598-021-03871-y
_version_ 1784629591443243008
author Wang, Qianqian
Liu, Fang’ai
Zhao, Xiaohui
Tan, Qiaoqiao
author_facet Wang, Qianqian
Liu, Fang’ai
Zhao, Xiaohui
Tan, Qiaoqiao
author_sort Wang, Qianqian
collection PubMed
description Click-through rate (CTR) prediction, which aims to predict the probability that a user clicks on an item, is critical to online advertising. How to capture the user's evolving interests from the behavior sequence is an important issue in CTR prediction. However, most existing models ignore the fact that the sequence is composed of sessions: user behavior can be divided into sessions according to occurrence time, and behaviors are highly correlated within a session but largely unrelated across sessions. We propose an effective model for CTR prediction, named the Session Interest Model via Self-Attention (SISA). First, we divide the user's sequential behavior into sessions, and a self-attention mechanism with bias coding is used to model each session. Since different session interests may be related to each other or follow a sequential pattern, we then use a gated recurrent unit (GRU) to capture the interaction and evolution of the user's historical session interests in the session interest extractor module. Finally, we use local activation and a GRU to aggregate the session interests with respect to the target ad and form the final representation of the behavior sequence in the session interest interacting module. Experimental results show that the SISA model outperforms other models.
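The description above outlines three stages: self-attention (with a bias term) within each session, a GRU over the resulting session interests, and target-ad-weighted aggregation. The following is a minimal NumPy sketch of that pipeline, not the authors' implementation: the dimensions, the mean-pooling of attended items into a session interest, the single-head attention, and the random parameters are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, bias):
    """Scaled dot-product self-attention over one session of T item embeddings (T, d)."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d) + bias   # (T, T), bias stands in for "bias coding"
    return softmax(scores, axis=-1) @ X    # (T, d) attended item representations

def gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step; used here to evolve the hidden state across session interests."""
    z = 1.0 / (1.0 + np.exp(-(x @ Wz + h @ Uz)))        # update gate
    r = 1.0 / (1.0 + np.exp(-(x @ Wr + h @ Ur)))        # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)            # candidate state
    return (1 - z) * h + z * h_tilde

d, T, n_sessions = 8, 5, 3
sessions = [rng.normal(size=(T, d)) for _ in range(n_sessions)]
bias = rng.normal(scale=0.1, size=(T, T))

# Stage 1: self-attention per session, mean-pooled into one interest vector each.
interests = np.stack([self_attention(S, bias).mean(axis=0) for S in sessions])

# Stage 2: GRU over session interests to capture their interaction and evolution.
params = [rng.normal(scale=0.1, size=(d, d)) for _ in range(6)]
h = np.zeros(d)
states = []
for s in interests:
    h = gru_cell(s, h, *params)
    states.append(h)
states = np.stack(states)                 # (n_sessions, d)

# Stage 3: local activation — weight session states by similarity to the target ad.
target = rng.normal(size=d)
weights = softmax(states @ target)        # attention over sessions, sums to 1
user_repr = weights @ states              # final behavior-sequence representation

print(user_repr.shape)  # (8,)
```

In the paper's full model these representations would be concatenated with other features and fed to an MLP for the CTR score; the sketch stops at the user representation.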
format Online
Article
Text
id pubmed-8741903
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-8741903 2022-01-10 Session interest model for CTR prediction based on self-attention mechanism Wang, Qianqian Liu, Fang’ai Zhao, Xiaohui Tan, Qiaoqiao Sci Rep Article
Nature Publishing Group UK 2022-01-07 /pmc/articles/PMC8741903/ /pubmed/34996985 http://dx.doi.org/10.1038/s41598-021-03871-y Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
spellingShingle Article
Wang, Qianqian
Liu, Fang’ai
Zhao, Xiaohui
Tan, Qiaoqiao
Session interest model for CTR prediction based on self-attention mechanism
title Session interest model for CTR prediction based on self-attention mechanism
title_full Session interest model for CTR prediction based on self-attention mechanism
title_fullStr Session interest model for CTR prediction based on self-attention mechanism
title_full_unstemmed Session interest model for CTR prediction based on self-attention mechanism
title_short Session interest model for CTR prediction based on self-attention mechanism
title_sort session interest model for ctr prediction based on self-attention mechanism
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8741903/
https://www.ncbi.nlm.nih.gov/pubmed/34996985
http://dx.doi.org/10.1038/s41598-021-03871-y
work_keys_str_mv AT wangqianqian sessioninterestmodelforctrpredictionbasedonselfattentionmechanism
AT liufangai sessioninterestmodelforctrpredictionbasedonselfattentionmechanism
AT zhaoxiaohui sessioninterestmodelforctrpredictionbasedonselfattentionmechanism
AT tanqiaoqiao sessioninterestmodelforctrpredictionbasedonselfattentionmechanism