When BERT meets Bilbo: a learning curve analysis of pretrained language model on disease classification
BACKGROUND: Natural language processing (NLP) tasks in the health domain often deal with a limited amount of labeled data due to high annotation costs and naturally rare observations. To compensate for the lack of training data, health NLP researchers often have to leverage knowledge and resources ext...
Main Authors: Li, Xuedong; Yuan, Walter; Peng, Dezhong; Mei, Qiaozhu; Wang, Yue
Format: Online Article Text
Language: English
Published: BioMed Central, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8981604/
https://www.ncbi.nlm.nih.gov/pubmed/35382811
http://dx.doi.org/10.1186/s12911-022-01829-2
Similar Items
- BatteryBERT: A Pretrained Language Model for Battery Database Enhancement
  by: Huang, Shu, et al.
  Published: (2022)
- Improving rare disease classification using imperfect knowledge graph
  by: Li, Xuedong, et al.
  Published: (2019)
- Med-BERT: pretrained contextualized embeddings on large-scale structured electronic health records for disease prediction
  by: Rasmy, Laila, et al.
  Published: (2021)
- Towards Transfer Learning Techniques—BERT, DistilBERT, BERTimbau, and DistilBERTimbau for Automatic Text Classification from Different Languages: A Case Study
  by: Silva Barbon, Rafael, et al.
  Published: (2022)
- VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
  by: Lu, Zhibin, et al.
  Published: (2020)