A pre-training and self-training approach for biomedical named entity recognition
Named entity recognition (NER) is a key component of many scientific literature mining tasks, such as information retrieval, information extraction, and question answering; however, many modern approaches require large amounts of labeled training data in order to be effective. This severely limits the effectiveness of NER models in applications where expert annotations are difficult and expensive to obtain.
Main Authors: | Gao, Shang, Kotevska, Olivera, Sorokine, Alexandre, Christian, J. Blair |
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science 2021 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7872256/ https://www.ncbi.nlm.nih.gov/pubmed/33561139 http://dx.doi.org/10.1371/journal.pone.0246310 |
_version_ | 1783649152948764672 |
author | Gao, Shang Kotevska, Olivera Sorokine, Alexandre Christian, J. Blair |
author_facet | Gao, Shang Kotevska, Olivera Sorokine, Alexandre Christian, J. Blair |
author_sort | Gao, Shang |
collection | PubMed |
description | Named entity recognition (NER) is a key component of many scientific literature mining tasks, such as information retrieval, information extraction, and question answering; however, many modern approaches require large amounts of labeled training data in order to be effective. This severely limits the effectiveness of NER models in applications where expert annotations are difficult and expensive to obtain. In this work, we explore the effectiveness of transfer learning and semi-supervised self-training to improve the performance of NER models in biomedical settings with very limited labeled data (250-2000 labeled samples). We first pre-train a BiLSTM-CRF and a BERT model on a very large general biomedical NER corpus such as MedMentions or Semantic Medline, and then we fine-tune the model on a more specific target NER task that has very limited training data; finally, we apply semi-supervised self-training using unlabeled data to further boost model performance. We show that in NER tasks that focus on common biomedical entity types such as those in the Unified Medical Language System (UMLS), combining transfer learning with self-training enables an NER model such as a BiLSTM-CRF or BERT to obtain similar performance to the same model trained on 3x-8x the amount of labeled data. We further show that our approach can also boost performance in a low-resource application where entity types are rarer and not specifically covered in UMLS. (A minimal illustrative sketch of this pipeline follows the record fields below.) |
format | Online Article Text |
id | pubmed-7872256 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-7872256 2021-02-19 A pre-training and self-training approach for biomedical named entity recognition Gao, Shang Kotevska, Olivera Sorokine, Alexandre Christian, J. Blair PLoS One Research Article Named entity recognition (NER) is a key component of many scientific literature mining tasks, such as information retrieval, information extraction, and question answering; however, many modern approaches require large amounts of labeled training data in order to be effective. This severely limits the effectiveness of NER models in applications where expert annotations are difficult and expensive to obtain. In this work, we explore the effectiveness of transfer learning and semi-supervised self-training to improve the performance of NER models in biomedical settings with very limited labeled data (250-2000 labeled samples). We first pre-train a BiLSTM-CRF and a BERT model on a very large general biomedical NER corpus such as MedMentions or Semantic Medline, and then we fine-tune the model on a more specific target NER task that has very limited training data; finally, we apply semi-supervised self-training using unlabeled data to further boost model performance. We show that in NER tasks that focus on common biomedical entity types such as those in the Unified Medical Language System (UMLS), combining transfer learning with self-training enables an NER model such as a BiLSTM-CRF or BERT to obtain similar performance to the same model trained on 3x-8x the amount of labeled data. We further show that our approach can also boost performance in a low-resource application where entity types are rarer and not specifically covered in UMLS. Public Library of Science 2021-02-09 /pmc/articles/PMC7872256/ /pubmed/33561139 http://dx.doi.org/10.1371/journal.pone.0246310 Text en https://creativecommons.org/publicdomain/zero/1.0/ This is an open access article, free of all copyright, and may be freely reproduced, distributed, transmitted, modified, built upon, or otherwise used by anyone for any lawful purpose. The work is made available under the Creative Commons CC0 (https://creativecommons.org/publicdomain/zero/1.0/) public domain dedication. |
spellingShingle | Research Article Gao, Shang Kotevska, Olivera Sorokine, Alexandre Christian, J. Blair A pre-training and self-training approach for biomedical named entity recognition |
title | A pre-training and self-training approach for biomedical named entity recognition |
title_full | A pre-training and self-training approach for biomedical named entity recognition |
title_fullStr | A pre-training and self-training approach for biomedical named entity recognition |
title_full_unstemmed | A pre-training and self-training approach for biomedical named entity recognition |
title_short | A pre-training and self-training approach for biomedical named entity recognition |
title_sort | pre-training and self-training approach for biomedical named entity recognition |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7872256/ https://www.ncbi.nlm.nih.gov/pubmed/33561139 http://dx.doi.org/10.1371/journal.pone.0246310 |
work_keys_str_mv | AT gaoshang apretrainingandselftrainingapproachforbiomedicalnamedentityrecognition AT kotevskaolivera apretrainingandselftrainingapproachforbiomedicalnamedentityrecognition AT sorokinealexandre apretrainingandselftrainingapproachforbiomedicalnamedentityrecognition AT christianjblair apretrainingandselftrainingapproachforbiomedicalnamedentityrecognition AT gaoshang pretrainingandselftrainingapproachforbiomedicalnamedentityrecognition AT kotevskaolivera pretrainingandselftrainingapproachforbiomedicalnamedentityrecognition AT sorokinealexandre pretrainingandselftrainingapproachforbiomedicalnamedentityrecognition AT christianjblair pretrainingandselftrainingapproachforbiomedicalnamedentityrecognition |
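The description field above outlines a three-stage pipeline: pre-train an NER model on a large general biomedical corpus (e.g. MedMentions), fine-tune it on a small labeled target task, then self-train on unlabeled target-domain text. The following Python sketch is not taken from the article's code; it only illustrates the control flow under assumed helper names (`train_ner`, `predict_with_confidence`), an assumed sentence-level confidence measure, and an assumed 0.9 threshold, with the actual model training stubbed out.

```python
# Illustrative sketch only, NOT the authors' released implementation.
# Stage 1: pre-train on a large general biomedical NER corpus.
# Stage 2: fine-tune on the small target task (250-2000 labeled sentences).
# Stage 3: self-train on unlabeled text using high-confidence pseudo-labels.
from typing import List, Tuple

Sentence = List[str]   # a sentence as a list of tokens
TagSeq = List[str]     # one BIO tag per token, e.g. ["B-Disease", "I-Disease", "O"]


def train_ner(model, data: List[Tuple[Sentence, TagSeq]], epochs: int = 3):
    """Stand-in for training a BiLSTM-CRF or BERT token classifier on (tokens, tags) pairs."""
    for _ in range(epochs):
        pass  # gradient updates over `data` would go here in a real implementation
    return model


def predict_with_confidence(model, sentence: Sentence) -> Tuple[TagSeq, float]:
    """Stand-in returning predicted tags and a sentence-level confidence in [0, 1]."""
    return ["O"] * len(sentence), 0.0


def self_train(model, labeled: List[Tuple[Sentence, TagSeq]],
               unlabeled: List[Sentence], rounds: int = 3,
               threshold: float = 0.9):
    """Iteratively grow the training set with confidently pseudo-labeled sentences."""
    train_set = list(labeled)
    for _ in range(rounds):
        model = train_ner(model, train_set)
        pseudo = []
        for sent in unlabeled:
            tags, confidence = predict_with_confidence(model, sent)
            if confidence >= threshold:           # keep only confident pseudo-labels
                pseudo.append((sent, tags))
        train_set = list(labeled) + pseudo        # re-label the unlabeled pool each round
    return model


if __name__ == "__main__":
    model = object()          # placeholder for a BiLSTM-CRF or BERT model
    general_corpus = []       # large general biomedical NER corpus (e.g. MedMentions)
    target_labeled = []       # 250-2000 labeled target-task sentences
    target_unlabeled = []     # unlabeled target-domain sentences

    model = train_ner(model, general_corpus)                      # stage 1: pre-train
    model = train_ner(model, target_labeled)                      # stage 2: fine-tune
    model = self_train(model, target_labeled, target_unlabeled)   # stage 3: self-train
```

In practice the stubbed functions would wrap a real token-classification training loop and a per-sentence confidence score; the rounds, threshold, and pseudo-label selection rule sketched here are assumptions, not values reported by the article.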