
ALeRT-COVID: Attentive Lockdown-awaRe Transfer Learning for Predicting COVID-19 Pandemics in Different Countries


Bibliographic Details
Main Authors: Li, Yingxue, Jia, Wenxiao, Wang, Junmei, Guo, Jianying, Liu, Qin, Li, Xiang, Xie, Guotong, Wang, Fei
Format: Online Article Text
Language: English
Published: Springer International Publishing 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7786857/
https://www.ncbi.nlm.nih.gov/pubmed/33426422
http://dx.doi.org/10.1007/s41666-020-00088-y
author Li, Yingxue
Jia, Wenxiao
Wang, Junmei
Guo, Jianying
Liu, Qin
Li, Xiang
Xie, Guotong
Wang, Fei
collection PubMed
description Countries across the world are at different stages of the COVID-19 trajectory, and many have implemented lockdown measures to prevent its spread. Although lockdown is effective at such prevention, it may push the economy into a depression. Predicting the epidemic progression as the government switches the lockdown on or off is therefore critical. We propose a transfer learning approach called ALeRT-COVID that uses an attention-based recurrent neural network (RNN) architecture to predict the epidemic trends for different countries. A source model was trained on the pre-defined source countries and then transferred to each target country. The lockdown measure was introduced to our model as a predictor, and the attention mechanism was utilized to learn the different contributions of the confirmed cases of past days to the future trend. Results demonstrated that the transfer learning strategy is helpful, especially for early-stage countries. By introducing the lockdown predictor and the attention mechanism, ALeRT-COVID showed a significant improvement in prediction performance. We predicted the confirmed cases one week ahead under two scenarios: extending the lockdown and easing it. Our results show that lockdown measures are still necessary in several countries. We expect our research to help different countries make better decisions on lockdown measures.
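The abstract describes the model's general shape: an RNN encodes the past days' confirmed cases together with a lockdown indicator, and an attention mechanism weights each past day's contribution to the next-day forecast. The record contains no code or architectural details, so the following is only a minimal numpy sketch of that idea; every parameter name, shape, and the choice of a vanilla RNN cell with additive-style attention are assumptions, not the paper's actual ALeRT-COVID implementation.

```python
import numpy as np

def rnn_attention_forecast(cases, lockdown, params):
    """One forward pass: encode T past days with a vanilla RNN cell,
    attend over the hidden states, and predict the next-day count.

    cases:    (T,) normalized daily confirmed counts
    lockdown: (T,) 0/1 lockdown indicator, fed as a second input channel
    Returns the scalar prediction and the (T,) attention weights.
    """
    Wx, Wh, b, v, Wo, bo = (params[k] for k in ("Wx", "Wh", "b", "v", "Wo", "bo"))
    h = np.zeros(Wh.shape[0])
    hidden_states = []
    for x_t, l_t in zip(cases, lockdown):
        inp = np.array([x_t, l_t])            # case count + lockdown flag per day
        h = np.tanh(Wx @ inp + Wh @ h + b)    # vanilla RNN cell update
        hidden_states.append(h)
    hs = np.stack(hidden_states)              # (T, H)

    scores = hs @ v                           # one attention score per past day
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                      # softmax: weights over past days
    context = alpha @ hs                      # attention-weighted summary, (H,)
    y_hat = float(Wo @ context + bo)          # next-day prediction
    return y_hat, alpha

def init_params(hidden=8, seed=0):
    """Random parameters; in the paper's transfer setup these would be
    fitted on source countries, then fine-tuned per target country."""
    rng = np.random.default_rng(seed)
    return {
        "Wx": rng.normal(scale=0.1, size=(hidden, 2)),
        "Wh": rng.normal(scale=0.1, size=(hidden, hidden)),
        "b":  np.zeros(hidden),
        "v":  rng.normal(scale=0.1, size=hidden),
        "Wo": rng.normal(scale=0.1, size=hidden),
        "bo": 0.0,
    }
```

The transfer step the abstract mentions would amount to training `params` on the pooled source-country series and then continuing training on a target country's shorter series; the "what-if" scenarios correspond to rolling the forecast forward with the lockdown channel held at 1 (extend) or 0 (ease).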
id pubmed-7786857
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-7786857 2021-01-06 J Healthc Inform Res, Research Article. Springer International Publishing 2021-01-06 /pmc/articles/PMC7786857/ /pubmed/33426422 http://dx.doi.org/10.1007/s41666-020-00088-y Text en © The Author(s), under exclusive licence to Springer Nature Switzerland AG part of Springer Nature 2021
topic Research Article