Modified Bidirectional Encoder Representations From Transformers Extractive Summarization Model for Hospital Information Systems Based on Character-Level Tokens (AlphaBERT): Development and Performance Evaluation
BACKGROUND: Doctors must care for many patients simultaneously, and it is time-consuming to find and examine all patients’ medical histories. Discharge diagnoses provide hospital staff with sufficient information to enable handling multiple patients; however, the excessive amount of words in the dia...
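The title refers to building the model on character-level tokens rather than word or subword tokens. As a rough illustration only (the abstract is truncated before the method details, so this is not the paper's actual implementation), a minimal character-level encoding might look like the following sketch; the vocabulary handling and special-token names are assumptions.

```python
# Minimal sketch of character-level tokenization, illustrating the general idea
# behind character-level tokens for a BERT-style model. Vocabulary construction
# and the [PAD]/[UNK] token names are illustrative assumptions, not AlphaBERT's
# actual implementation.

def build_char_vocab(texts):
    """Map each distinct character to an integer id, reserving ids for padding and unknown."""
    vocab = {"[PAD]": 0, "[UNK]": 1}
    for text in texts:
        for ch in text:
            if ch not in vocab:
                vocab[ch] = len(vocab)
    return vocab

def encode(text, vocab, max_len=128):
    """Convert a string into a fixed-length sequence of character ids."""
    ids = [vocab.get(ch, vocab["[UNK]"]) for ch in text[:max_len]]
    ids += [vocab["[PAD]"]] * (max_len - len(ids))
    return ids

# Example: encode a short diagnosis-style string character by character.
vocab = build_char_vocab(["acute appendicitis", "community-acquired pneumonia"])
print(encode("acute pneumonia", vocab, max_len=32))
```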
Main authors: Chen, Yen-Pin; Chen, Yi-Ying; Lin, Jr-Jiun; Huang, Chien-Hua; Lai, Feipei
Format: Online Article Text
Language: English
Published: JMIR Publications, 2020
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7221648/ | https://www.ncbi.nlm.nih.gov/pubmed/32347806 | http://dx.doi.org/10.2196/17787
Similar items
- BERT-Kgly: A Bidirectional Encoder Representations From Transformers (BERT)-Based Model for Predicting Lysine Glycation Site for Homo sapiens
  by: Liu, Yinbo, et al.
  Published: (2022)
- Transfer Learning for Sentiment Classification Using Bidirectional Encoder Representations from Transformers (BERT) Model
  by: Areshey, Ali, et al.
  Published: (2023)
- CharAs-CBert: Character Assist Construction-Bert Sentence Representation Improving Sentiment Classification
  by: Chen, Bo, et al.
  Published: (2022)
- Do syntactic trees enhance Bidirectional Encoder Representations from Transformers (BERT) models for chemical–drug relation extraction?
  by: Tang, Anfu, et al.
  Published: (2022)
- Shapley Idioms: Analysing BERT Sentence Embeddings for General Idiom Token Identification
  by: Nedumpozhimana, Vasudevan, et al.
  Published: (2022)