UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping
Recent advances in unmanned aerial vehicles (UAVs), mini and mobile sensors, and GeoAI (a blend of geospatial and artificial intelligence (AI) research) are the main highlights among agricultural innovations for improving crop productivity and thus securing vulnerable food systems. This study investigated...
Main authors: | Nguyen, Canh; Sagan, Vasit; Bhadra, Sourav; Moose, Stephen |
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2023 |
Subjects: | Article |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9965167/ https://www.ncbi.nlm.nih.gov/pubmed/36850425 http://dx.doi.org/10.3390/s23041827 |
_version_ | 1784896690568822784 |
author | Nguyen, Canh Sagan, Vasit Bhadra, Sourav Moose, Stephen |
author_facet | Nguyen, Canh Sagan, Vasit Bhadra, Sourav Moose, Stephen |
author_sort | Nguyen, Canh |
collection | PubMed |
description | Recent advances in unmanned aerial vehicles (UAVs), mini and mobile sensors, and GeoAI (a blend of geospatial and artificial intelligence (AI) research) are the main highlights among agricultural innovations for improving crop productivity and thus securing vulnerable food systems. This study investigated the versatility of UAV-borne multisensory data fusion within a multi-task deep learning framework for high-throughput phenotyping in maize. Data were collected by UAVs equipped with a set of miniaturized sensors, including hyperspectral, thermal, and LiDAR, over an experimental corn field in Urbana, IL, USA during the growing season. A full suite of eight phenotypes was measured in situ at the end of the season as ground truth data, specifically dry stalk biomass, cob biomass, dry grain yield, harvest index, grain nitrogen utilization efficiency (Grain NutE), grain nitrogen content, total plant nitrogen content, and grain density. After a series of radiometric calibrations and geo-corrections, the aerial data were analytically processed in three primary approaches. First, an extended version of the normalized difference spectral index (NDSI) served as a simple arithmetic combination of different data modalities to explore the degree of correlation with maize phenotypes. The extended NDSI analysis revealed that the NIR spectra (750–1000 nm) alone were strongly related to all eight maize traits. Second, a fusion of vegetation indices, structural indices, and a thermal index, selectively handcrafted from each data modality, was fed to classical machine learning regressors: Support Vector Machine (SVM) and Random Forest (RF). Prediction performance varied from phenotype to phenotype, ranging from R² = 0.34 for grain density up to R² = 0.85 for both grain nitrogen content and total plant nitrogen content.
Further, a fusion of hyperspectral and LiDAR data overcame the limitations of any single data modality, especially the vegetation saturation effect that occurs in optical remote sensing. Third, a multi-task deep convolutional neural network (CNN) was customized to take a raw imagery fusion of hyperspectral, thermal, and LiDAR data and predict multiple maize traits at a time. The multi-task deep learning model performed comparably to, and for some traits better than, mono-task deep learning and the machine learning regressors. Data augmentation for the deep learning models boosted prediction accuracy, which helps alleviate the intrinsic limitations of small sample sizes and unbalanced sample classes in remote sensing research. Theoretical and practical implications for plant breeders and crop growers are also made explicit in the discussion. |
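The NDSI mentioned in the abstract is the standard normalized difference of two bands; the "extended" version pairs bands across modalities. A minimal NumPy sketch of the core computation follows, with per-plot reflectance values that are purely hypothetical (they are not data from the study):

```python
import numpy as np

def ndsi(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Normalized difference spectral index, (a - b) / (a + b).
    In the extended form, band_a and band_b may come from different
    modalities (e.g., a spectral band paired with a thermal or LiDAR
    height layer) rather than two bands of the same sensor."""
    return (band_a - band_b) / (band_a + band_b)

# Hypothetical per-plot mean reflectances for two NIR bands (0-1 scale).
nir_800 = np.array([0.42, 0.55, 0.38])
nir_900 = np.array([0.30, 0.41, 0.25])

print(ndsi(nir_800, nir_900))
```

In practice this index would be computed for every candidate band pair and screened against each measured phenotype to find the most correlated combination.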
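The multi-task CNN described in the abstract can be pictured as a shared convolutional trunk over the fused image stack with one regression head per phenotype. The PyTorch sketch below illustrates that pattern only; the channel count, layer sizes, and trait names are illustrative assumptions, not the authors' architecture:

```python
import torch
import torch.nn as nn

class MultiTaskTraitCNN(nn.Module):
    """Illustrative multi-task regressor: a shared convolutional trunk
    over a fused image stack (hyperspectral + thermal + LiDAR channels)
    feeding one small regression head per phenotype."""

    def __init__(self, in_channels: int, traits: list):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global average pool to (B, 64, 1, 1)
            nn.Flatten(),             # -> (B, 64)
        )
        # One linear head per trait, all sharing the trunk features.
        self.heads = nn.ModuleDict({t: nn.Linear(64, 1) for t in traits})

    def forward(self, x: torch.Tensor) -> dict:
        z = self.trunk(x)
        return {t: head(z).squeeze(-1) for t, head in self.heads.items()}

# Hypothetical setup: 10 fused channels, 3 of the 8 traits, 2 plot images.
traits = ["grain_yield", "grain_n_content", "stalk_biomass"]
model = MultiTaskTraitCNN(in_channels=10, traits=traits)
x = torch.randn(2, 10, 32, 32)
out = model(x)
print({t: tuple(v.shape) for t, v in out.items()})
```

Training such a model typically minimizes the sum of per-trait losses (e.g., MSE), which is what lets all heads share, and jointly shape, the trunk features.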
format | Online Article Text |
id | pubmed-9965167 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-99651672023-02-26 UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping Nguyen, Canh; Sagan, Vasit; Bhadra, Sourav; Moose, Stephen. Sensors (Basel). Article. MDPI 2023-02-06 /pmc/articles/PMC9965167/ /pubmed/36850425 http://dx.doi.org/10.3390/s23041827 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Nguyen, Canh Sagan, Vasit Bhadra, Sourav Moose, Stephen UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping |
title | UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping |
title_full | UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping |
title_fullStr | UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping |
title_full_unstemmed | UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping |
title_short | UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping |
title_sort | uav multisensory data fusion and multi-task deep learning for high-throughput maize phenotyping |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9965167/ https://www.ncbi.nlm.nih.gov/pubmed/36850425 http://dx.doi.org/10.3390/s23041827 |
work_keys_str_mv | AT nguyencanh uavmultisensorydatafusionandmultitaskdeeplearningforhighthroughputmaizephenotyping AT saganvasit uavmultisensorydatafusionandmultitaskdeeplearningforhighthroughputmaizephenotyping AT bhadrasourav uavmultisensorydatafusionandmultitaskdeeplearningforhighthroughputmaizephenotyping AT moosestephen uavmultisensorydatafusionandmultitaskdeeplearningforhighthroughputmaizephenotyping |