
Pretrained transformer framework on pediatric claims data for population specific tasks

The adoption of electronic health records (EHR) has become universal during the past decade, which has afforded in-depth data-based research. By learning from the large amount of healthcare data, various data-driven models have been built to predict future events for different medical tasks, such as auto diagnosis and heart-attack prediction. Although EHR is abundant, the population that satisfies specific criteria for learning population-specific tasks is scarce, making it challenging to train data-hungry deep learning models. This study presents the Claim Pre-Training (Claim-PT) framework, a generic pre-training model that first trains on the entire pediatric claims dataset, followed by a discriminative fine-tuning on each population-specific task. The semantic meaning of medical events can be captured in the pre-training stage, and the effective knowledge transfer is completed through the task-aware fine-tuning stage. The fine-tuning process requires minimal parameter modification without changing the model architecture, which mitigates the data scarcity issue and helps train the deep learning model adequately on small patient cohorts. We conducted experiments on a real-world pediatric dataset with more than one million patient records. Experimental results on two downstream tasks demonstrated the effectiveness of our method: our general task-agnostic pre-training framework outperformed tailored task-specific models, achieving more than 10% higher model performance as compared to baselines. In addition, our framework showed a potential to transfer learned knowledge from one institution to another, which may pave the way for future healthcare model pre-training across institutions.

Bibliographic Details
Main Authors: Zeng, Xianlong, Linwood, Simon L., Liu, Chang
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8901645/
https://www.ncbi.nlm.nih.gov/pubmed/35256645
http://dx.doi.org/10.1038/s41598-022-07545-1
author Zeng, Xianlong
Linwood, Simon L.
Liu, Chang
collection PubMed
description The adoption of electronic health records (EHR) has become universal during the past decade, which has afforded in-depth data-based research. By learning from the large amount of healthcare data, various data-driven models have been built to predict future events for different medical tasks, such as auto diagnosis and heart-attack prediction. Although EHR is abundant, the population that satisfies specific criteria for learning population-specific tasks is scarce, making it challenging to train data-hungry deep learning models. This study presents the Claim Pre-Training (Claim-PT) framework, a generic pre-training model that first trains on the entire pediatric claims dataset, followed by a discriminative fine-tuning on each population-specific task. The semantic meaning of medical events can be captured in the pre-training stage, and the effective knowledge transfer is completed through the task-aware fine-tuning stage. The fine-tuning process requires minimal parameter modification without changing the model architecture, which mitigates the data scarcity issue and helps train the deep learning model adequately on small patient cohorts. We conducted experiments on a real-world pediatric dataset with more than one million patient records. Experimental results on two downstream tasks demonstrated the effectiveness of our method: our general task-agnostic pre-training framework outperformed tailored task-specific models, achieving more than 10% higher in model performance as compared to baselines. In addition, our framework showed a potential to transfer learned knowledge from one institution to another, which may pave the way for future healthcare model pre-training across institutions.
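The description above outlines a two-stage pattern: pre-train event representations on the full unlabeled claims corpus, then fine-tune a small task head on a scarce labeled cohort with minimal parameter updates. The sketch below is hypothetical and greatly simplified; it is not the authors' Claim-PT code, it substitutes a toy co-occurrence embedding for the transformer, and all event names (`asthma_dx`, `inhaler_rx`, etc.) are invented for illustration.

```python
from collections import defaultdict
import math

def pretrain_embeddings(claim_sequences):
    """Stage 1: learn event vectors from the full unlabeled claims corpus.
    Toy stand-in for transformer pre-training: each event is represented
    by its co-occurrence counts with every other event in the vocabulary."""
    vocab = sorted({e for seq in claim_sequences for e in seq})
    cooc = defaultdict(float)
    for seq in claim_sequences:
        for i, a in enumerate(seq):
            for b in seq[i + 1:]:
                cooc[(a, b)] += 1.0
                cooc[(b, a)] += 1.0
    return {e: [cooc[(e, f)] for f in vocab] for e in vocab}

def encode(seq, emb):
    """Sum-pool a patient's event sequence into one fixed-size vector."""
    dim = len(next(iter(emb.values())))
    vec = [0.0] * dim
    for e in seq:
        for k, x in enumerate(emb.get(e, [0.0] * dim)):
            vec[k] += x
    return vec

def sigmoid(z):
    # Clamp z to avoid math.exp overflow on extreme inputs.
    return 1.0 / (1.0 + math.exp(-max(min(z, 30.0), -30.0)))

def finetune_head(xs, ys, epochs=200, lr=0.01):
    """Stage 2: train only a small logistic head on the labeled cohort;
    the pretrained embeddings stay frozen (minimal parameter updates)."""
    dim = len(xs[0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            g = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) - y
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(x, w, b):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Large unlabeled corpus (pre-training) vs. tiny labeled cohort (fine-tuning).
corpus = [
    ["ER_visit", "asthma_dx", "inhaler_rx"],
    ["well_child", "vaccine"],
    ["asthma_dx", "inhaler_rx", "ER_visit"],
]
emb = pretrain_embeddings(corpus)

cohort = [["asthma_dx", "inhaler_rx"], ["well_child"]]
labels = [1, 0]
xs = [encode(seq, emb) for seq in cohort]
w, b = finetune_head(xs, labels)
```

During fine-tuning only `w` and `b` (a handful of parameters) are updated while the pretrained representations stay fixed, which illustrates the property the abstract credits for making small patient cohorts trainable.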
format Online
Article
Text
id pubmed-8901645
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-8901645 2022-03-08 Pretrained transformer framework on pediatric claims data for population specific tasks. Zeng, Xianlong; Linwood, Simon L.; Liu, Chang. Sci Rep (Article).
Nature Publishing Group UK 2022-03-07 /pmc/articles/PMC8901645/ /pubmed/35256645 http://dx.doi.org/10.1038/s41598-022-07545-1 Text en © The Author(s) 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title Pretrained transformer framework on pediatric claims data for population specific tasks
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8901645/
https://www.ncbi.nlm.nih.gov/pubmed/35256645
http://dx.doi.org/10.1038/s41598-022-07545-1