Improving Model Transferability for Clinical Note Section Classification Models Using Continued Pretraining
OBJECTIVE: The classification of clinical note sections is a critical step before doing more fine-grained natural language processing tasks such as social determinants of health extraction and temporal information extraction. Often, clinical note section classification models that achieve high accur...
Main authors: | Zhou, Weipeng; Yetisgen, Meliha; Afshar, Majid; Gao, Yanjun; Savova, Guergana; Miller, Timothy A. |
Format: | Online Article Text |
Language: | English |
Published: | Cold Spring Harbor Laboratory, 2023 |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10168403/ https://www.ncbi.nlm.nih.gov/pubmed/37162963 http://dx.doi.org/10.1101/2023.04.15.23288628 |
Similar items
- Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models
  by: Alam, Minhaj Nur, et al.
  Published: (2023)
- Comparison of Pretraining Models and Strategies for Health-Related Social Media Text Classification
  by: Guo, Yuting, et al.
  Published: (2022)
- An Improved Math Word Problem (MWP) Model Using Unified Pretrained Language Model (UniLM) for Pretraining
  by: Zhang, Dongqiu, et al.
  Published: (2022)
- Hotel Review Classification Based on the Text Pretraining Heterogeneous Graph Neural Network Model
  by: Zhang, Liyan, et al.
  Published: (2022)
- Prediction of Antifungal Activity of Antimicrobial Peptides by Transfer Learning from Protein Pretrained Models
  by: Lobo, Fernando, et al.
  Published: (2023)