Transformer-based convolutional forgetting knowledge tracking

Knowledge tracking analyzes students' mastery of knowledge from their learning trajectories. This is important for online education, since it can determine a learner's current knowledge level from the learning history and then make recommendations for future learning. In the past, the model commonly used for knowledge tracking was the convolutional neural network, but it struggles with long-term sequence dependencies. Since its invention, the Transformer has shown excellent performance in long-sequence modeling by virtue of its attention mechanism, and it has gradually been introduced into the field of knowledge tracking. However, our research shows that some knowledge tracking datasets contain a large amount of continuous, repetitive practice, which causes the Transformer model to ignore potential connections between some knowledge points. To overcome this problem, we introduce a convolutional attention mechanism that helps the model better perceive contextual information. In addition, we simulate students' forgetting during the learning process by computing a forgetting factor and fusing it with the weight matrix generated by the model, improving the model's accuracy. As a result, this paper presents a Transformer-based Convolutional Forgetting Knowledge Tracking (TCFKT) model. Experimental results on the real-world ASSISTments2012, ASSISTments2017, KDD a, and STATIC datasets show that the TCFKT model outperforms other knowledge tracking models.

Full description

Bibliographic Details
Main Authors: Liu, Tieyuan, Zhang, Meng, Zhu, Chuangying, Chang, Liang
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10625530/
https://www.ncbi.nlm.nih.gov/pubmed/37925491
http://dx.doi.org/10.1038/s41598-023-45936-0
author Liu, Tieyuan
Zhang, Meng
Zhu, Chuangying
Chang, Liang
author_facet Liu, Tieyuan
Zhang, Meng
Zhu, Chuangying
Chang, Liang
author_sort Liu, Tieyuan
collection PubMed
description Knowledge tracking analyzes students' mastery of knowledge from their learning trajectories. This is important for online education, since it can determine a learner's current knowledge level from the learning history and then make recommendations for future learning. In the past, the model commonly used for knowledge tracking was the convolutional neural network, but it struggles with long-term sequence dependencies. Since its invention, the Transformer has shown excellent performance in long-sequence modeling by virtue of its attention mechanism, and it has gradually been introduced into the field of knowledge tracking. However, our research shows that some knowledge tracking datasets contain a large amount of continuous, repetitive practice, which causes the Transformer model to ignore potential connections between some knowledge points. To overcome this problem, we introduce a convolutional attention mechanism that helps the model better perceive contextual information. In addition, we simulate students' forgetting during the learning process by computing a forgetting factor and fusing it with the weight matrix generated by the model, improving the model's accuracy. As a result, this paper presents a Transformer-based Convolutional Forgetting Knowledge Tracking (TCFKT) model. Experimental results on the real-world ASSISTments2012, ASSISTments2017, KDD a, and STATIC datasets show that the TCFKT model outperforms other knowledge tracking models.
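The abstract's two key ideas can be illustrated in a short sketch. The paper does not publish its exact formulation here, so the following is a minimal, hypothetical NumPy illustration under two assumptions: the convolutional attention is modeled as a 1-D smoothing of the attention-score matrix along the key axis, and the forgetting factor is an exponential decay exp(-theta * gap) over the time gap between interactions, fused into the weights before the softmax. The function name, kernel size, and decay rate `theta` are all illustrative, not the authors' values.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def conv_forgetting_attention(Q, K, V, gaps, theta=0.1, kernel=3):
    """Hypothetical sketch of TCFKT-style attention: scaled dot-product
    attention with (a) a 1-D convolutional smoothing of the score matrix,
    so each step also sees its neighbours' scores, and (b) a forgetting
    factor exp(-theta * gap) fused into the weight matrix.

    Q, K, V: (T, d) arrays; gaps: (T, T) matrix of time gaps between
    interactions i and j (larger gap -> more forgetting)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                      # (T, T) raw scores
    # Convolutional smoothing along the key axis (assumed kernel size).
    pad = kernel // 2
    padded = np.pad(scores, ((0, 0), (pad, pad)), mode="edge")
    smoothed = np.stack(
        [padded[:, i:i + scores.shape[1]] for i in range(kernel)]
    ).mean(axis=0)
    # Forgetting factor: adding log(decay) pre-softmax multiplies the
    # attention weight by exp(-theta * gap) post-softmax (up to renorm).
    fused = smoothed + np.log(np.exp(-theta * gaps) + 1e-9)
    # Causal mask: each interaction attends only to past interactions.
    future = np.triu(np.ones_like(fused, dtype=bool), k=1)
    fused = np.where(future, -1e9, fused)
    return softmax(fused) @ V                          # (T, d) output
```

Each output row is a forgetting-weighted mixture of past value vectors; a larger `theta` makes older interactions fade faster, which is the behavior the forgetting factor is meant to capture.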
format Online
Article
Text
id pubmed-10625530
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-106255302023-11-06 Transformer-based convolutional forgetting knowledge tracking Liu, Tieyuan Zhang, Meng Zhu, Chuangying Chang, Liang Sci Rep Article
Nature Publishing Group UK 2023-11-04 /pmc/articles/PMC10625530/ /pubmed/37925491 http://dx.doi.org/10.1038/s41598-023-45936-0 Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) .
spellingShingle Article
Liu, Tieyuan
Zhang, Meng
Zhu, Chuangying
Chang, Liang
Transformer-based convolutional forgetting knowledge tracking
title Transformer-based convolutional forgetting knowledge tracking
title_full Transformer-based convolutional forgetting knowledge tracking
title_fullStr Transformer-based convolutional forgetting knowledge tracking
title_full_unstemmed Transformer-based convolutional forgetting knowledge tracking
title_short Transformer-based convolutional forgetting knowledge tracking
title_sort transformer-based convolutional forgetting knowledge tracking
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10625530/
https://www.ncbi.nlm.nih.gov/pubmed/37925491
http://dx.doi.org/10.1038/s41598-023-45936-0
work_keys_str_mv AT liutieyuan transformerbasedconvolutionalforgettingknowledgetracking
AT zhangmeng transformerbasedconvolutionalforgettingknowledgetracking
AT zhuchuangying transformerbasedconvolutionalforgettingknowledgetracking
AT changliang transformerbasedconvolutionalforgettingknowledgetracking