Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton
Site-specific treatment of weeds in agricultural landscapes has been gaining importance in recent years due to economic savings and minimal impact on the environment. Different detection methods have been developed and tested for precision weed management systems, but recent developments in neural networks have offered great prospects.
Main Authors: | Sapkota, Bishwa B.; Popescu, Sorin; Rajan, Nithya; Leon, Ramon G.; Reberg-Horton, Chris; Mirsky, Steven; Bagavathiannan, Muthukumar V. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2022 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9666527/ https://www.ncbi.nlm.nih.gov/pubmed/36379963 http://dx.doi.org/10.1038/s41598-022-23399-z |
_version_ | 1784831533199130624 |
---|---|
author | Sapkota, Bishwa B.; Popescu, Sorin; Rajan, Nithya; Leon, Ramon G.; Reberg-Horton, Chris; Mirsky, Steven; Bagavathiannan, Muthukumar V. |
author_facet | Sapkota, Bishwa B.; Popescu, Sorin; Rajan, Nithya; Leon, Ramon G.; Reberg-Horton, Chris; Mirsky, Steven; Bagavathiannan, Muthukumar V. |
author_sort | Sapkota, Bishwa B. |
collection | PubMed |
description | Site-specific treatment of weeds in agricultural landscapes has been gaining importance in recent years due to economic savings and minimal impact on the environment. Different detection methods have been developed and tested for precision weed management systems, but recent developments in neural networks have offered great prospects. However, a major limitation of neural network models is the requirement of high volumes of data for training. The current study explores synthetic images as an alternative to real images to address this issue. In this study, synthetic images were generated with various strategies using plant instances clipped from UAV-borne real images. In addition, the Generative Adversarial Networks (GAN) technique was used to generate fake plant instances, which were also used to generate synthetic images. These images were used to train a powerful convolutional neural network (CNN), Mask R-CNN, for weed detection and segmentation in a transfer learning mode. The study was conducted on morningglories (MG) and grass weeds (Grass) infesting cotton. Biomass for individual weeds was also collected in the field for biomass modeling using detection and segmentation results derived from model inference. Results showed comparable performance between the real plant instance-based synthetic image dataset (mean average precision for mask, mAP(m): 0.60; mean average precision for bounding box, mAP(b): 0.64) and the real image dataset (mAP(m): 0.80; mAP(b): 0.81). However, the mixed dataset (real images + real plant instance-based synthetic images) resulted in no performance gain for the segmentation mask and only a very small gain for the bounding box (mAP(m): 0.80; mAP(b): 0.83). Around 40–50 plant instances were sufficient for generating synthetic images that resulted in optimal performance. Row orientation of cotton in the synthetic images was beneficial compared to random orientation. Synthetic images generated with automatically clipped plant instances performed similarly to those generated with manually clipped instances. Synthetic images based on GAN-derived fake plant instances did not perform as effectively as those based on real plant instances. The canopy mask area predicted weed biomass better than the bounding box area, with R² values of 0.66 and 0.46 for MG and Grass, respectively. The findings of this study offer valuable insights for guiding future efforts toward using synthetic images for weed detection, segmentation, and biomass estimation in row crops. (An illustrative code sketch of the image-compositing and biomass-modeling steps follows this record.) |
format | Online Article Text |
id | pubmed-9666527 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-9666527 2022-11-17 Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton Sapkota, Bishwa B.; Popescu, Sorin; Rajan, Nithya; Leon, Ramon G.; Reberg-Horton, Chris; Mirsky, Steven; Bagavathiannan, Muthukumar V. Sci Rep Article Nature Publishing Group UK 2022-11-15 /pmc/articles/PMC9666527/ /pubmed/36379963 http://dx.doi.org/10.1038/s41598-022-23399-z Text en © The Author(s) 2022. This article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/); material not covered by the licence requires permission from the copyright holder. |
spellingShingle | Article Sapkota, Bishwa B.; Popescu, Sorin; Rajan, Nithya; Leon, Ramon G.; Reberg-Horton, Chris; Mirsky, Steven; Bagavathiannan, Muthukumar V. Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton |
title | Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton |
title_full | Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton |
title_fullStr | Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton |
title_full_unstemmed | Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton |
title_short | Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton |
title_sort | use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9666527/ https://www.ncbi.nlm.nih.gov/pubmed/36379963 http://dx.doi.org/10.1038/s41598-022-23399-z |
work_keys_str_mv | AT sapkotabishwab useofsyntheticimagesfortrainingadeeplearningmodelforweeddetectionandbiomassestimationincotton AT popescusorin useofsyntheticimagesfortrainingadeeplearningmodelforweeddetectionandbiomassestimationincotton AT rajannithya useofsyntheticimagesfortrainingadeeplearningmodelforweeddetectionandbiomassestimationincotton AT leonramong useofsyntheticimagesfortrainingadeeplearningmodelforweeddetectionandbiomassestimationincotton AT reberghortonchris useofsyntheticimagesfortrainingadeeplearningmodelforweeddetectionandbiomassestimationincotton AT mirskysteven useofsyntheticimagesfortrainingadeeplearningmodelforweeddetectionandbiomassestimationincotton AT bagavathiannanmuthukumarv useofsyntheticimagesfortrainingadeeplearningmodelforweeddetectionandbiomassestimationincotton |
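The record above describes two steps that lend themselves to short code sketches: compositing synthetic training images by pasting clipped plant instances onto a crop-row background, and regressing measured weed biomass on the predicted canopy mask area. The sketches below are illustrative only and are not the authors' implementation; the directory layout (`cutouts/`, `soil_background.png`), the label-from-filename convention, the canvas size, and the placeholder biomass numbers are all assumptions.

```python
"""Sketch of the synthetic-image strategy summarized in the description field:
plant instances clipped from UAV images (stored here as RGBA cutouts with
transparent backgrounds, an assumed format) are pasted at random positions onto
a soil/cotton-row background, and a binary mask is kept for each pasted
instance so the composites can serve as Mask R-CNN training data."""
import random
from pathlib import Path

import numpy as np
from PIL import Image

CANVAS_SIZE = (1024, 1024)                 # assumed output resolution
CUTOUT_DIR = Path("cutouts")               # hypothetical folder of RGBA plant cutouts
BACKGROUND = Path("soil_background.png")   # hypothetical bare-soil background image


def paste_instance(canvas: Image.Image, cutout: Image.Image, xy: tuple) -> np.ndarray:
    """Alpha-composite one cutout onto the canvas and return its binary mask."""
    alpha = cutout.split()[-1]              # alpha channel marks plant pixels
    canvas.paste(cutout, xy, mask=alpha)    # paste only where the cutout is opaque
    full_mask = Image.new("L", canvas.size, 0)
    full_mask.paste(alpha, xy)              # same footprint becomes the instance mask
    return (np.asarray(full_mask) > 127).astype(np.uint8)


def make_synthetic_image(n_instances: int = 10, seed: int = 0):
    """Compose one synthetic image plus per-instance masks and class labels."""
    rng = random.Random(seed)
    canvas = Image.open(BACKGROUND).convert("RGB").resize(CANVAS_SIZE)
    cutout_paths = sorted(CUTOUT_DIR.glob("*.png"))
    masks, labels = [], []
    for _ in range(n_instances):
        path = rng.choice(cutout_paths)
        cutout = Image.open(path).convert("RGBA")
        x = rng.randint(0, max(0, CANVAS_SIZE[0] - cutout.width))
        y = rng.randint(0, max(0, CANVAS_SIZE[1] - cutout.height))
        masks.append(paste_instance(canvas, cutout, (x, y)))
        labels.append(path.stem.split("_")[0])  # assumes names like "MG_001.png" or "Grass_017.png"
    return canvas, masks, labels
```

For the biomass step, the abstract reports R² values of 0.66 (MG) and 0.46 (Grass) when canopy mask area is the predictor; a minimal version of that fit, with placeholder numbers rather than the paper's data, could look like:

```python
# Minimal sketch of the biomass-modeling step: ordinary least squares relating
# predicted canopy mask area (pixels) to measured dry biomass, scored with R².
# The arrays are placeholders for illustration, not data from the paper.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

mask_area_px = np.array([[1200.0], [3400.0], [5100.0], [8000.0]])  # per-plant mask areas
biomass_g = np.array([2.1, 5.8, 9.0, 14.2])                        # measured dry biomass (g)

model = LinearRegression().fit(mask_area_px, biomass_g)
print("R²:", r2_score(biomass_g, model.predict(mask_area_px)))
```

Because the paper found row-oriented cotton placement beneficial compared with random orientation, a fuller compositing pipeline would constrain the cotton paste coordinates to simulated row lines rather than sampling them uniformly as in the sketch above.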