Protected Health Information Recognition by Fine-Tuning a Pre-training Transformer Model
OBJECTIVES: De-identifying protected health information (PHI) in medical documents is important, and a prerequisite to de-identification is the identification of PHI entity names in clinical documents. This study aimed to compare the performance of three pre-training models that have recently attrac...
Main Authors: Oh, Seo Hyun; Kang, Min; Lee, Youngho
Format: Online Article Text
Language: English
Published: Korean Society of Medical Informatics, 2022
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8850174/ https://www.ncbi.nlm.nih.gov/pubmed/35172087 http://dx.doi.org/10.4258/hir.2022.28.1.16
Similar Items
- Mechanics to pre-process information for the fine tuning of mechanoreceptors
  by: Barth, Friedrich G.
  Published: (2019)
- Facial Expression Recognition Based on Fine-Tuned Channel–Spatial Attention Transformer
  by: Yao, Huang, et al.
  Published: (2023)
- Advancing Brain Tumor Classification through Fine-Tuned Vision Transformers: A Comparative Study of Pre-Trained Models
  by: Asiri, Abdullah A., et al.
  Published: (2023)
- Investigation of improving the pre-training and fine-tuning of BERT model for biomedical relation extraction
  by: Su, Peng, et al.
  Published: (2022)
- An efficient ptychography reconstruction strategy through fine-tuning of large pre-trained deep learning model
  by: Pan, Xinyu, et al.
  Published: (2023)