Deep Knowledge Tracing with Transformers
In this work, we propose a Transformer-based model to trace students’ knowledge acquisition. We modified the Transformer structure to utilize 1) the association between questions and skills and 2) the elapsed time between question steps. The use of question-skill associations allows the model to learn specific representations for frequently encountered questions while representing rare questions with their underlying skill representations. The inclusion of elapsed time opens the opportunity to address forgetting. Our approach outperforms the state-of-the-art methods in the literature by roughly 10% in AUC on frequently used public datasets.
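The abstract names two architectural changes but the record carries no implementation detail. As a reading aid only, here is a minimal sketch of how the first change (question-skill associations) might look in PyTorch: frequent questions keep their own learned embedding, while rare questions fall back to the mean of their linked skill embeddings. Every name, shape, and the frequency-gating rule below are assumptions, not the authors' published method.

```python
# Minimal sketch (NOT the paper's code) of question-skill embeddings:
# frequent questions use a question-specific vector; rare questions are
# represented by the mean of their skill embeddings. Threshold is assumed.
import torch
import torch.nn as nn


class QuestionSkillEmbedding(nn.Module):
    def __init__(self, n_questions, n_skills, d_model, freq_threshold=50):
        super().__init__()
        self.question_emb = nn.Embedding(n_questions, d_model)
        self.skill_emb = nn.Embedding(n_skills, d_model)
        self.freq_threshold = freq_threshold  # assumed cutoff for "frequent"

    def forward(self, question_ids, skill_ids, question_counts):
        # question_ids: (batch, seq) question indices
        # skill_ids: (batch, seq, max_skills) skills linked to each question,
        #            padded with -1 where a question has fewer skills
        # question_counts: (batch, seq) training-set frequency of each question
        q_vec = self.question_emb(question_ids)                    # (B, S, D)
        mask = (skill_ids >= 0).unsqueeze(-1).float()              # (B, S, K, 1)
        s_vec = self.skill_emb(skill_ids.clamp(min=0)) * mask      # zero padding
        s_vec = s_vec.sum(dim=2) / mask.sum(dim=2).clamp(min=1.0)  # mean of skills
        use_question = question_counts >= self.freq_threshold
        return torch.where(use_question.unsqueeze(-1), q_vec, s_vec)
```

Hard gating by raw training-set frequency is only one plausible reading of "frequently encountered"; a soft mixture of question and skill embeddings would serve the same purpose.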
Main Authors: | Pu, Shi; Yudelson, Michael; Ou, Lu; Huang, Yuchi |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | 2020 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7334675/ http://dx.doi.org/10.1007/978-3-030-52240-7_46 |
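For the second change (elapsed time between question steps), one common way to let a Transformer "forget" is to subtract a time-dependent penalty from the attention logits before the softmax. The sketch below is such an assumption-laden illustration, not the paper's formulation; the decay rate `theta` and the `elapsed` gap matrix are invented for the example.

```python
# Minimal sketch (an assumption, not the paper's exact method) of elapsed-time
# decay inside causal scaled dot-product attention: larger time gaps between
# step i and step j lower the attention weight, approximating forgetting.
import math
import torch


def time_decayed_attention(q, k, v, elapsed, theta=0.1):
    # q, k, v: (batch, seq, d) query/key/value tensors
    # elapsed: (batch, seq, seq) time gap (e.g. hours) between steps i and j
    # theta: assumed decay rate; could be learned per attention head
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d)  # (B, S, S) raw logits
    scores = scores - theta * elapsed                # penalize distant steps
    causal = torch.ones_like(scores).tril()          # attend only to the past
    scores = scores.masked_fill(causal == 0, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v
```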
_version_ | 1783553977126748160 |
---|---|
author | Pu, Shi; Yudelson, Michael; Ou, Lu; Huang, Yuchi |
author_sort | Pu, Shi |
collection | PubMed |
description | In this work, we propose a Transformer-based model to trace students’ knowledge acquisition. We modified the Transformer structure to utilize 1) the association between questions and skills and 2) the elapsed time between question steps. The use of question-skill associations allows the model to learn specific representations for frequently encountered questions while representing rare questions with their underlying skill representations. The inclusion of elapsed time opens the opportunity to address forgetting. Our approach outperforms the state-of-the-art methods in the literature by roughly 10% in AUC on frequently used public datasets. |
format | Online Article Text |
id | pubmed-7334675 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
record_format | MEDLINE/PubMed |
spelling | pubmed-7334675 2020-07-06 Deep Knowledge Tracing with Transformers Pu, Shi; Yudelson, Michael; Ou, Lu; Huang, Yuchi Artificial Intelligence in Education Article In this work, we propose a Transformer-based model to trace students’ knowledge acquisition. We modified the Transformer structure to utilize 1) the association between questions and skills and 2) the elapsed time between question steps. The use of question-skill associations allows the model to learn specific representations for frequently encountered questions while representing rare questions with their underlying skill representations. The inclusion of elapsed time opens the opportunity to address forgetting. Our approach outperforms the state-of-the-art methods in the literature by roughly 10% in AUC on frequently used public datasets. 2020-06-10 /pmc/articles/PMC7334675/ http://dx.doi.org/10.1007/978-3-030-52240-7_46 Text en © Springer Nature Switzerland AG 2020 This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic. |
title | Deep Knowledge Tracing with Transformers |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7334675/ http://dx.doi.org/10.1007/978-3-030-52240-7_46 |