Investigating Transformers for Automatic Short Answer Grading

Recent advances in deep learning for natural language processing have made it possible to apply novel architectures, such as the Transformer, to increasingly complex natural language processing tasks. Combined with novel unsupervised pre-training tasks such as masked language modeling, sentence ordering, or next-sentence prediction, these models have become even more accurate. In this work, we experiment with fine-tuning different pre-trained Transformer-based architectures. We train the newest and, according to the GLUE benchmark, most powerful Transformers on the SemEval-2013 dataset. We also explore how transferring a model first fine-tuned on the MNLI dataset to the SemEval-2013 dataset affects generalization and performance. We report up to 13% absolute improvement in macro-averaged F1 over state-of-the-art results. We show that models trained with knowledge distillation are feasible for use in short answer grading. Furthermore, we compare multilingual models on a machine-translated version of the SemEval-2013 dataset.

Bibliographic Details
Main Authors: Camus, Leon; Filighera, Anna
Format: Online Article Text
Language: English
Published: 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7334688/
http://dx.doi.org/10.1007/978-3-030-52240-7_8
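
The abstract describes fine-tuning pre-trained Transformers on the SemEval-2013 short answer grading data, optionally starting from a model already fine-tuned on MNLI. As a minimal sketch only — this record contains no code, so the checkpoint name, label set, and hyperparameters below are assumptions, not the authors' setup — one fine-tuning step with the Hugging Face transformers library could look like this:

```python
# Minimal sketch (not the authors' released code): one fine-tuning step for
# 3-way short answer grading with a pre-trained Transformer, in the spirit
# of the abstract. Checkpoint, labels, and learning rate are assumptions.
import torch
from torch.optim import AdamW
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Starting from an MNLI checkpoint mirrors the transfer-learning step the
# abstract mentions; "roberta-large-mnli" is a public example, not
# necessarily the checkpoint the authors used.
model_name = "roberta-large-mnli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# The MNLI head is already 3-way, so it is reused here and re-trained on
# grading labels (e.g. correct / contradictory / incorrect).
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

# A SemEval-2013-style instance: a student answer graded against a reference.
reference = "The bulb lights because the closed circuit lets current flow."
student = "Electricity can move through the wire, so the bulb lights up."
inputs = tokenizer(reference, student, truncation=True, return_tensors="pt")
label = torch.tensor([0])  # 0 = "correct" in this sketch's label mapping

optimizer = AdamW(model.parameters(), lr=2e-5)  # assumed learning rate
model.train()
loss = model(**inputs, labels=label).loss  # cross-entropy over the 3 classes
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

Casting grading as sentence-pair classification is what makes the MNLI transfer natural: judging whether a student answer follows from a reference answer closely resembles textual entailment.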
author Camus, Leon
Filighera, Anna
collection PubMed
description Recent advances in deep learning for natural language processing have made it possible to apply novel architectures, such as the Transformer, to increasingly complex natural language processing tasks. Combined with novel unsupervised pre-training tasks such as masked language modeling, sentence ordering, or next-sentence prediction, these models have become even more accurate. In this work, we experiment with fine-tuning different pre-trained Transformer-based architectures. We train the newest and, according to the GLUE benchmark, most powerful Transformers on the SemEval-2013 dataset. We also explore how transferring a model first fine-tuned on the MNLI dataset to the SemEval-2013 dataset affects generalization and performance. We report up to 13% absolute improvement in macro-averaged F1 over state-of-the-art results. We show that models trained with knowledge distillation are feasible for use in short answer grading. Furthermore, we compare multilingual models on a machine-translated version of the SemEval-2013 dataset.
format Online
Article
Text
id pubmed-7334688
institution National Center for Biotechnology Information
language English
publishDate 2020
record_format MEDLINE/PubMed
spelling pubmed-7334688 2020-07-06 Investigating Transformers for Automatic Short Answer Grading Camus, Leon Filighera, Anna Artificial Intelligence in Education Article Recent advances in deep learning for natural language processing have made it possible to apply novel architectures, such as the Transformer, to increasingly complex natural language processing tasks. Combined with novel unsupervised pre-training tasks such as masked language modeling, sentence ordering, or next-sentence prediction, these models have become even more accurate. In this work, we experiment with fine-tuning different pre-trained Transformer-based architectures. We train the newest and, according to the GLUE benchmark, most powerful Transformers on the SemEval-2013 dataset. We also explore how transferring a model first fine-tuned on the MNLI dataset to the SemEval-2013 dataset affects generalization and performance. We report up to 13% absolute improvement in macro-averaged F1 over state-of-the-art results. We show that models trained with knowledge distillation are feasible for use in short answer grading. Furthermore, we compare multilingual models on a machine-translated version of the SemEval-2013 dataset. 2020-06-10 /pmc/articles/PMC7334688/ http://dx.doi.org/10.1007/978-3-030-52240-7_8 Text en © Springer Nature Switzerland AG 2020 This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.
title Investigating Transformers for Automatic Short Answer Grading
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7334688/
http://dx.doi.org/10.1007/978-3-030-52240-7_8