
Evolution and forecasting of PM10 concentration at the Port of Gijon (Spain)

Bibliographic Details
Main Authors: Sánchez Lasheras, Fernando, García Nieto, Paulino José, García Gonzalo, Esperanza, Bonavera, Laura, de Cos Juez, Francisco Javier
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7366928/
https://www.ncbi.nlm.nih.gov/pubmed/32678178
http://dx.doi.org/10.1038/s41598-020-68636-5
Description
Summary: The name PM10 refers to small particles with a diameter of less than 10 microns. The present research analyses different models capable of predicting PM10 concentration using the previous values of PM10, SO2, NO, NO2, CO and O3 as input variables. The models were trained on data from January 2010 to December 2017. The models trained were the autoregressive integrated moving average (ARIMA), vector autoregressive moving average (VARMA), multilayer perceptron neural network (MLP), support vector machine as regressor (SVMR) and multivariate adaptive regression splines (MARS). Predictions were performed from 1 to 6 months in advance. The performance of the different models was measured in terms of the root mean squared error (RMSE). For forecasting 1 month ahead, the best results were obtained with an SVMR model of six variables, which gave an RMSE of 4.2649; the MLP results were very close, with an RMSE of 4.3402. For forecasts 6 months in advance, the best results correspond to an MLP model of six variables with an RMSE of 6.0873, followed by an SVMR model, also with six variables, which gave an RMSE of 6.1010. For forecasts both 1 and 6 months ahead, ARIMA outperformed VARMA models.
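
The abstract describes the forecasting workflow only in words. As a rough illustration, the following Python sketch (not the authors' code) builds one-month-ahead supervised pairs from six monthly pollutant series, fits a support vector regressor, and reports the RMSE, i.e. the square root of the mean squared difference between forecast and observed PM10. The file name, column names and hyperparameters are assumptions made for this example.

import numpy as np
import pandas as pd
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error

POLLUTANTS = ["PM10", "SO2", "NO", "NO2", "CO", "O3"]  # the six input variables
HORIZON = 1  # forecast horizon in months (the paper evaluates 1 to 6)

# Hypothetical monthly averages, one row per month, one column per pollutant.
df = pd.read_csv("gijon_monthly.csv", parse_dates=["date"], index_col="date")

# Supervised pairs: pollutant values at month t -> PM10 at month t + HORIZON.
X = df[POLLUTANTS].iloc[:-HORIZON].to_numpy()
y = df["PM10"].shift(-HORIZON).dropna().to_numpy()

# Chronological split so the test months come after the training months.
split = int(len(X) * 0.85)
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

# Scale the inputs and fit an RBF support vector regressor (illustrative hyperparameters).
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_train, y_train)

# RMSE: square root of the mean squared difference between forecasts and observations.
rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
print(f"RMSE at a {HORIZON}-month horizon: {rmse:.4f}")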