Fine-Tuning Bidirectional Encoder Representations From Transformers (BERT)–Based Models on Large-Scale Electronic Health Record Notes: An Empirical Study

BACKGROUND: The bidirectional encoder representations from transformers (BERT) model has achieved great success in many natural language processing (NLP) tasks, such as named entity recognition and question answering. However, little prior work has explored the use of this model for an important task…

Bibliographic Details
Main Authors: Li, Fei; Jin, Yonghao; Liu, Weisong; Rawat, Bhanu Pratap Singh; Cai, Pengshan; Yu, Hong
Format: Online Article (Text)
Language: English
Published: JMIR Publications, 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6746103/
https://www.ncbi.nlm.nih.gov/pubmed/31516126
http://dx.doi.org/10.2196/14830