Pretrained transformer framework on pediatric claims data for population specific tasks
The adoption of electronic health records (EHR) has become nearly universal during the past decade, enabling in-depth data-driven research. By learning from large amounts of healthcare data, various models have been built to predict future events for different medical tasks, such as...
Main Authors: Zeng, Xianlong; Linwood, Simon L.; Liu, Chang
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8901645/
https://www.ncbi.nlm.nih.gov/pubmed/35256645
http://dx.doi.org/10.1038/s41598-022-07545-1
Similar Items
- Operationalizing and Implementing Pretrained, Large Artificial Intelligence Linguistic Models in the US Health Care System: Outlook of Generative Pretrained Transformer 3 (GPT-3) as a Service Model
  by: Sezgin, Emre, et al.
  Published: (2022)
- EpiGePT: a Pretrained Transformer model for epigenomics
  by: Gao, Zijing, et al.
  Published: (2023)
- Neural Data Transformer 2: Multi-context Pretraining for Neural Spiking Activity
  by: Ye, Joel, et al.
  Published: (2023)
- Medical image captioning via generative pretrained transformers
  by: Selivanov, Alexander, et al.
  Published: (2023)
- To pretrain or not? A systematic analysis of the benefits of pretraining in diabetic retinopathy
  by: Srinivasan, Vignesh, et al.
  Published: (2022)