COVID-19 outbreak prediction using Seq2Seq + Attention and Word2Vec keyword time series data

Bibliographic Details
Main Authors: Kim, Yeongha, Park, Chang-Reung, Ahn, Jae-Pyoung, Jang, Beakcheol
Format: Online Article Text
Language: English
Published: Public Library of Science 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10132639/
https://www.ncbi.nlm.nih.gov/pubmed/37099535
http://dx.doi.org/10.1371/journal.pone.0284298
Description
Summary: As of 2022, COVID-19, first reported in Wuhan, China, in November 2019, has become a worldwide pandemic, causing numerous infections and casualties and enormous social and economic damage. To mitigate its impact, various COVID-19 prediction studies have emerged, most of them using mathematical models and artificial intelligence. However, the prediction accuracy of these models drops considerably when the observed duration of the COVID-19 outbreak is short. In this paper, we propose a new prediction method that combines Word2Vec keyword time series data with the existing long short-term memory (LSTM) and Seq2Seq + Attention models. We compare the prediction errors of the existing and proposed models on reported COVID-19 case data from five US states: California, Texas, Florida, New York, and Illinois. The experimental results show that the proposed models combining Word2Vec with the existing LSTM and Seq2Seq + Attention achieve better predictions and lower errors than the existing LSTM and Seq2Seq + Attention models alone: the Pearson correlation coefficient increased by 0.05 to 0.21 and the RMSE decreased by 0.03 to 0.08 compared to the existing methods.
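The abstract reports its comparison in terms of the Pearson correlation coefficient and RMSE between predicted and observed case counts. As a minimal illustrative sketch (not the paper's own code; function names and the sample series are invented for illustration), these two metrics can be computed in plain Python as:

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series:
    # covariance divided by the product of the standard deviations.
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rmse(preds, obs):
    # Root-mean-square error between predicted and observed values.
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(preds, obs)) / len(preds))

# Hypothetical daily case counts, purely for illustration.
observed = [100, 120, 150, 160, 180]
predicted = [ 95, 125, 145, 165, 175]
print(pearson(predicted, observed))
print(rmse(predicted, observed))
```

A higher Pearson coefficient (closer to 1) and a lower RMSE both indicate that the predicted series tracks the observed outbreak curve more closely, which is how the abstract's reported improvements of +0.05 to +0.21 (Pearson) and −0.03 to −0.08 (RMSE) should be read.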