Adapting Bidirectional Encoder Representations from Transformers (BERT) to Assess Clinical Semantic Textual Similarity: Algorithm Development and Validation Study
BACKGROUND: Natural Language Understanding enables automatic extraction of relevant information from clinical text data, which are acquired every day in hospitals. In 2018, the language model Bidirectional Encoder Representations from Transformers (BERT) was introduced, generating new state-of-the-a...
Main authors: Kades, Klaus; Sellner, Jan; Koehler, Gregor; Full, Peter M; Lai, T Y Emmy; Kleesiek, Jens; Maier-Hein, Klaus H
Format: Online Article Text
Language: English
Published: JMIR Publications, 2021
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7889424/
https://www.ncbi.nlm.nih.gov/pubmed/33533728
http://dx.doi.org/10.2196/22795
Similar items
- Semantic Textual Similarity in Japanese Clinical Domain Texts Using BERT
  by: Mutinda, Faith Wavinya, et al.
  Published: (2021)
- Using Character-Level and Entity-Level Representations to Enhance Bidirectional Encoder Representation From Transformers-Based Clinical Semantic Textual Similarity Model: ClinicalSTS Modeling Study
  by: Xiong, Ying, et al.
  Published: (2020)
- Transfer Learning for Sentiment Classification Using Bidirectional Encoder Representations from Transformers (BERT) Model
  by: Areshey, Ali, et al.
  Published: (2023)
- BERT-Kgly: A Bidirectional Encoder Representations From Transformers (BERT)-Based Model for Predicting Lysine Glycation Site for Homo sapiens
  by: Liu, Yinbo, et al.
  Published: (2022)
- Distributed representation and one-hot representation fusion with gated network for clinical semantic textual similarity
  by: Xiong, Ying, et al.
  Published: (2020)