
A comparison of attentional neural network architectures for modeling with electronic medical records


Bibliographic Details
Main Authors: Finch, Anthony, Crowell, Alexander, Chang, Yung-Chieh, Parameshwarappa, Pooja, Martinez, Jose, Horberg, Michael
Format: Online Article Text
Language: English
Published: Oxford University Press 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8358476/
https://www.ncbi.nlm.nih.gov/pubmed/34396057
http://dx.doi.org/10.1093/jamiaopen/ooab064
_version_ 1783737348321705984
author Finch, Anthony
Crowell, Alexander
Chang, Yung-Chieh
Parameshwarappa, Pooja
Martinez, Jose
Horberg, Michael
author_facet Finch, Anthony
Crowell, Alexander
Chang, Yung-Chieh
Parameshwarappa, Pooja
Martinez, Jose
Horberg, Michael
author_sort Finch, Anthony
collection PubMed
description OBJECTIVE: Attention networks learn an intelligent weighted averaging mechanism over a series of entities, providing gains in both performance and interpretability. In this article, we propose a novel time-aware transformer-based network and compare it to another leading model with similar characteristics. We also decompose model performance along several critical axes and examine which features contribute most to our model’s performance. MATERIALS AND METHODS: Using data sets representing patient records obtained between 2017 and 2019 by the Kaiser Permanente Mid-Atlantic States medical system, we construct four attentional models with varying levels of complexity on two targets (patient mortality and hospitalization). We examine how incorporating transfer learning and demographic features contributes to model success. We also test the performance of a model proposed in recent medical modeling literature. We compare these models on out-of-sample data using the area under the receiver-operating characteristic curve (AUROC) and average precision as measures of performance. We also analyze the attentional weights assigned by these models to patient diagnoses. RESULTS: We found that our model significantly outperformed the alternative on a mortality prediction task (91.96% AUROC against 73.82% AUROC). Our model also outperformed on the hospitalization task, although the models were significantly more competitive in that space (82.41% AUROC against 80.33% AUROC). Furthermore, we found that demographic features and transfer learning features, which are frequently omitted from new models proposed in the EMR modeling space, contributed significantly to the success of our model. DISCUSSION: We proposed an original construction of deep learning electronic medical record models which achieved very strong performance.
We found that our unique model construction outperformed a leading literature alternative on several tasks, even when input data was held constant between them. We obtained further improvements by incorporating several methods that are frequently overlooked in new model proposals, suggesting that it will be useful to explore these options further in the future.
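The "weighted averaging mechanism over a series of entities" described in the abstract can be sketched as scaled dot-product attention over a patient's diagnosis-code embeddings. The embedding size, code count, and query vector below are illustrative assumptions, not details from the article:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
d = 8                                # embedding dimension (assumed)
codes = rng.normal(size=(5, d))      # 5 diagnosis-code embeddings (toy data)
query = rng.normal(size=d)           # learned query vector (toy stand-in)

scores = codes @ query / np.sqrt(d)  # scaled dot-product scores
weights = softmax(scores)            # attention weights over codes; sum to 1
patient_vector = weights @ codes     # weighted average of the code embeddings
```

The per-code weights are the quantity the authors inspect for interpretability; the weighted average is a fixed-size patient representation that a downstream classifier can consume.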
format Online
Article
Text
id pubmed-8358476
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Oxford University Press
record_format MEDLINE/PubMed
spelling pubmed-8358476 2021-08-12 A comparison of attentional neural network architectures for modeling with electronic medical records Finch, Anthony Crowell, Alexander Chang, Yung-Chieh Parameshwarappa, Pooja Martinez, Jose Horberg, Michael JAMIA Open Research and Applications OBJECTIVE: Attention networks learn an intelligent weighted averaging mechanism over a series of entities, providing gains in both performance and interpretability. In this article, we propose a novel time-aware transformer-based network and compare it to another leading model with similar characteristics. We also decompose model performance along several critical axes and examine which features contribute most to our model’s performance. MATERIALS AND METHODS: Using data sets representing patient records obtained between 2017 and 2019 by the Kaiser Permanente Mid-Atlantic States medical system, we construct four attentional models with varying levels of complexity on two targets (patient mortality and hospitalization). We examine how incorporating transfer learning and demographic features contributes to model success. We also test the performance of a model proposed in recent medical modeling literature. We compare these models on out-of-sample data using the area under the receiver-operating characteristic curve (AUROC) and average precision as measures of performance. We also analyze the attentional weights assigned by these models to patient diagnoses. RESULTS: We found that our model significantly outperformed the alternative on a mortality prediction task (91.96% AUROC against 73.82% AUROC). Our model also outperformed on the hospitalization task, although the models were significantly more competitive in that space (82.41% AUROC against 80.33% AUROC). Furthermore, we found that demographic features and transfer learning features, which are frequently omitted from new models proposed in the EMR modeling space, contributed significantly to the success of our model.
DISCUSSION: We proposed an original construction of deep learning electronic medical record models which achieved very strong performance. We found that our unique model construction outperformed a leading literature alternative on several tasks, even when input data was held constant between them. We obtained further improvements by incorporating several methods that are frequently overlooked in new model proposals, suggesting that it will be useful to explore these options further in the future. Oxford University Press 2021-08-12 /pmc/articles/PMC8358476/ /pubmed/34396057 http://dx.doi.org/10.1093/jamiaopen/ooab064 Text en © The Author(s) 2021. Published by Oxford University Press on behalf of the American Medical Informatics Association. This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (https://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited. For commercial re-use, please contact journals.permissions@oup.com
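The AUROC figures reported in the record (e.g. 91.96% against 73.82%) can be computed for any scored test set via the rank interpretation of AUROC: the probability that a randomly chosen positive case is scored above a randomly chosen negative one. A minimal sketch; the labels and scores below are made up, not data from the study:

```python
def auroc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of positive/negative pairs ranked correctly,
    counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: two positives, two negatives; one pair is mis-ranked.
print(auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

In practice a library routine (e.g. scikit-learn's `roc_auc_score`) does the same computation efficiently over large cohorts; the pairwise form above is just the definition made explicit.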
spellingShingle Research and Applications
Finch, Anthony
Crowell, Alexander
Chang, Yung-Chieh
Parameshwarappa, Pooja
Martinez, Jose
Horberg, Michael
A comparison of attentional neural network architectures for modeling with electronic medical records
title A comparison of attentional neural network architectures for modeling with electronic medical records
title_full A comparison of attentional neural network architectures for modeling with electronic medical records
title_fullStr A comparison of attentional neural network architectures for modeling with electronic medical records
title_full_unstemmed A comparison of attentional neural network architectures for modeling with electronic medical records
title_short A comparison of attentional neural network architectures for modeling with electronic medical records
title_sort comparison of attentional neural network architectures for modeling with electronic medical records
topic Research and Applications
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8358476/
https://www.ncbi.nlm.nih.gov/pubmed/34396057
http://dx.doi.org/10.1093/jamiaopen/ooab064
work_keys_str_mv AT finchanthony acomparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT crowellalexander acomparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT changyungchieh acomparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT parameshwarappapooja acomparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT martinezjose acomparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT horbergmichael acomparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT finchanthony comparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT crowellalexander comparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT changyungchieh comparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT parameshwarappapooja comparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT martinezjose comparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords
AT horbergmichael comparisonofattentionalneuralnetworkarchitecturesformodelingwithelectronicmedicalrecords