Fine-Tuning Bidirectional Encoder Representations From Transformers (BERT)–Based Models on Large-Scale Electronic Health Record Notes: An Empirical Study
BACKGROUND: The bidirectional encoder representations from transformers (BERT) model has achieved great success in many natural language processing (NLP) tasks, such as named entity recognition and question answering. However, little prior work has explored this model to be used for an important tas...
Main Authors: Li, Fei; Jin, Yonghao; Liu, Weisong; Rawat, Bhanu Pratap Singh; Cai, Pengshan; Yu, Hong
Format: Online Article Text
Language: English
Published: JMIR Publications, 2019
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6746103/ https://www.ncbi.nlm.nih.gov/pubmed/31516126 http://dx.doi.org/10.2196/14830
Similar Items
- Transfer Learning for Sentiment Classification Using Bidirectional Encoder Representations from Transformers (BERT) Model
  by: Areshey, Ali, et al.
  Published: (2023)
- BERT-Kgly: A Bidirectional Encoder Representations From Transformers (BERT)-Based Model for Predicting Lysine Glycation Site for Homo sapiens
  by: Liu, Yinbo, et al.
  Published: (2022)
- Do syntactic trees enhance Bidirectional Encoder Representations from Transformers (BERT) models for chemical–drug relation extraction?
  by: Tang, Anfu, et al.
  Published: (2022)
- Adapting Bidirectional Encoder Representations from Transformers (BERT) to Assess Clinical Semantic Textual Similarity: Algorithm Development and Validation Study
  by: Kades, Klaus, et al.
  Published: (2021)
- A Fine-Tuned Bidirectional Encoder Representations From Transformers Model for Food Named-Entity Recognition: Algorithm Development and Validation
  by: Stojanov, Riste, et al.
  Published: (2021)