Author Correction: A pre-trained BERT for Korean medical natural language processing
Main Authors: Kim, Yoojoong; Kim, Jong-Ho; Lee, Jeong Moon; Jang, Moon Joung; Yum, Yun Jin; Kim, Seongtae; Shin, Unsub; Kim, Young-Min; Joo, Hyung Joon; Song, Sanghoun
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10247737/ https://www.ncbi.nlm.nih.gov/pubmed/37286626 http://dx.doi.org/10.1038/s41598-023-36519-0
Similar Items
- A pre-trained BERT for Korean medical natural language processing
  by: Kim, Yoojoong, et al.
  Published: (2022)
- A Word Pair Dataset for Semantic Similarity and Relatedness in Korean Medical Vocabulary: Reference Development and Validation
  by: Yum, Yunjin, et al.
  Published: (2021)
- Predicting medical specialty from text based on a domain-specific pre-trained BERT
  by: Kim, Yoojoong, et al.
  Published: (2023)
- Investigating a neural language model’s replicability of psycholinguistic experiments: A case study of NPI licensing
  by: Shin, Unsub, et al.
  Published: (2023)
- BioBERT: a pre-trained biomedical language representation model for biomedical text mining
  by: Lee, Jinhyuk, et al.
  Published: (2020)