A clinical specific BERT developed using a huge Japanese clinical text corpus
Generalized language models that are pre-trained with a large corpus have achieved great performance on natural language tasks. While many pre-trained transformers for English have been published, few models are available for Japanese text, especially in clinical medicine. In this work, we demonstrate the...
Main authors: Kawazoe, Yoshimasa; Shibata, Daisaku; Shinohara, Emiko; Aramaki, Eiji; Ohe, Kazuhiko
Format: Online Article Text
Language: English
Published: Public Library of Science, 2021
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8577751/
https://www.ncbi.nlm.nih.gov/pubmed/34752490
http://dx.doi.org/10.1371/journal.pone.0259763
Similar items
- Semantic Textual Similarity in Japanese Clinical Domain Texts Using BERT
  by: Mutinda, Faith Wavinya, et al.
  Published: (2021)
- Impact of a Clinical Text–Based Fall Prediction Model on Preventing Extended Hospital Stays for Elderly Inpatients: Model Development and Performance Evaluation
  by: Kawazoe, Yoshimasa, et al.
  Published: (2022)
- Influence of Tweets Indicating False Rumors on COVID-19 Vaccination: Case Study
  by: Hirabayashi, Mai, et al.
  Published: (2023)
- VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification
  by: Lu, Zhibin, et al.
  Published: (2020)
- Korean clinical entity recognition from diagnosis text using BERT
  by: Kim, Young-Min, et al.
  Published: (2020)