
Exploiting time series of Sentinel-1 and Sentinel-2 to detect grassland mowing events using deep learning with reject region

Bibliographic Details
Main Authors: Komisarenko, Viacheslav, Voormansik, Kaupo, Elshawi, Radwa, Sakr, Sherif
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8770799/
https://www.ncbi.nlm.nih.gov/pubmed/35046488
http://dx.doi.org/10.1038/s41598-022-04932-6
Description
Summary: Governments pay agencies to control the activities of farmers who receive governmental support. Field visits are costly and highly time-consuming; hence, remote sensing is widely used for monitoring farmers’ activities. Nowadays, the vast amount of data available from the Sentinel missions has significantly boosted research in agriculture. Estonia is among the first countries to take advantage of this data source to automate the detection of mowing and ploughing events across the country. Although techniques that rely on optical data for monitoring agricultural events are favourable, the availability of such data during the growing season is limited. Thus, alternative data sources have to be evaluated. In this paper, we developed a deep learning model with an integrated reject option for detecting grassland mowing events using time series of Sentinel-1 radar and Sentinel-2 optical images acquired from 2000 fields in Estonia during the 2018 vegetative season. The rejection mechanism is based on a threshold over the prediction confidence of the proposed model. The proposed model significantly outperforms the state-of-the-art technique, achieving an event accuracy of 73.3% and an end-of-season accuracy of 94.8%.
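The reject option mentioned in the summary can be illustrated with a short sketch. The snippet below is not the authors' implementation; it only shows, assuming a classifier that outputs softmax probabilities per field, how a threshold over prediction confidence separates accepted detections from rejected (deferred) ones. The function name and the 0.9 threshold value are hypothetical and purely illustrative.

# Minimal sketch of a confidence-threshold reject option (assumed setup,
# not the paper's code): predictions whose maximum softmax confidence falls
# below the threshold are withheld, e.g. for manual field inspection.
import numpy as np

def predict_with_reject(probabilities: np.ndarray, threshold: float = 0.9):
    """Return a list of (label, confidence) tuples; label is None when rejected.

    probabilities: array of shape (n_samples, n_classes) with softmax outputs.
    threshold: illustrative confidence cut-off below which a prediction is rejected.
    """
    labels = probabilities.argmax(axis=1)       # most likely class per sample
    confidences = probabilities.max(axis=1)     # confidence of that class
    results = []
    for label, conf in zip(labels, confidences):
        if conf >= threshold:
            results.append((int(label), float(conf)))  # accept: confident prediction
        else:
            results.append((None, float(conf)))        # reject: defer the decision
    return results

# Example: two confident predictions and one low-confidence sample that is rejected.
probs = np.array([[0.97, 0.03], [0.08, 0.92], [0.55, 0.45]])
print(predict_with_reject(probs, threshold=0.9))

In such a scheme, raising the threshold trades coverage for accuracy: fewer fields receive an automatic decision, but those that do are decided with higher confidence.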