Fusion neural networks for plant classification: learning to combine RGB, hyperspectral, and lidar data

Airborne remote sensing offers unprecedented opportunities to efficiently monitor vegetation, but methods to delineate and classify individual plant species using the collected data are still actively being developed and improved. The Integrating Data science with Trees and Remote Sensing (IDTReeS)...

Bibliographic Details
Main Authors: Scholl, Victoria M., McGlinchy, Joseph, Price-Broncucia, Teo, Balch, Jennifer K., Joseph, Maxwell B.
Format: Online Article Text
Language: English
Published: PeerJ Inc. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8325917/
https://www.ncbi.nlm.nih.gov/pubmed/34395073
http://dx.doi.org/10.7717/peerj.11790
_version_ 1783731654079021056
author Scholl, Victoria M.
McGlinchy, Joseph
Price-Broncucia, Teo
Balch, Jennifer K.
Joseph, Maxwell B.
author_facet Scholl, Victoria M.
McGlinchy, Joseph
Price-Broncucia, Teo
Balch, Jennifer K.
Joseph, Maxwell B.
author_sort Scholl, Victoria M.
collection PubMed
description Airborne remote sensing offers unprecedented opportunities to efficiently monitor vegetation, but methods to delineate and classify individual plant species using the collected data are still actively being developed and improved. The Integrating Data science with Trees and Remote Sensing (IDTReeS) plant identification competition openly invited scientists to create and compare individual tree mapping methods. Participants were tasked with training taxon identification algorithms on data from two sites and then transferring their methods to a third, unseen site, using field-based plant observations in combination with airborne remote sensing image data products from the National Ecological Observatory Network (NEON). These data were captured by a high-resolution digital camera sensitive to red, green, and blue (RGB) light, a hyperspectral imaging spectrometer spanning visible to shortwave infrared wavelengths, and lidar systems, together capturing the spectral and structural properties of vegetation. As participants in the IDTReeS competition, we developed a two-stage deep learning approach to integrate NEON remote sensing data from all three sensors and classify individual plant species and genera. The first stage was a convolutional neural network that generates taxon probabilities from RGB images, and the second stage was a fusion neural network that “learns” how to combine these probabilities with hyperspectral and lidar data. Our two-stage approach leverages the ability of neural networks to flexibly and automatically extract descriptive features from complex, high-dimensional image data. Our method achieved an overall classification accuracy of 0.51 on the training set and 0.32 on the test set, which contained data from an unseen site with unknown taxon classes. Although transferring classification algorithms to unseen sites with unknown species and genus classes proved challenging, developing methods with openly available NEON data, which will be collected in a standardized format for 30 years, allows for continual improvement and major gains for the computational ecology community. We outline promising directions related to data preparation and processing techniques for further investigation, and provide our code to contribute to open, reproducible science efforts.
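
The abstract describes a two-stage design: stage 1 is a CNN that maps RGB crops to taxon probabilities, and stage 2 is a fusion network that combines those probabilities with hyperspectral and lidar features. The following is a minimal sketch of what such a pipeline can look like, assuming PyTorch; it is not the authors' released code (linked from the article), and all class names, layer sizes, and feature dimensions here are hypothetical placeholders.

# Minimal sketch of a two-stage fusion architecture as described in the
# abstract. NOT the authors' released code; all names and sizes are
# hypothetical placeholders.
import torch
import torch.nn as nn

class RGBTaxonCNN(nn.Module):
    """Stage 1: a small CNN mapping an RGB crop to per-taxon probabilities."""
    def __init__(self, n_taxa: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # -> (B, 32, 1, 1)
        )
        self.classifier = nn.Linear(32, n_taxa)

    def forward(self, rgb):                   # rgb: (B, 3, H, W)
        x = self.features(rgb).flatten(1)     # -> (B, 32)
        return torch.softmax(self.classifier(x), dim=1)  # taxon probabilities

class FusionNet(nn.Module):
    """Stage 2: an MLP that learns to combine stage-1 probabilities with
    hyperspectral and lidar-derived features into a final prediction."""
    def __init__(self, n_taxa: int, n_hsi: int, n_lidar: int):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(n_taxa + n_hsi + n_lidar, 64), nn.ReLU(),
            nn.Linear(64, n_taxa),            # logits over taxa
        )

    def forward(self, rgb_probs, hsi, lidar):
        fused = torch.cat([rgb_probs, hsi, lidar], dim=1)
        return self.mlp(fused)

# Toy usage with made-up dimensions: 10 taxa, 369 hyperspectral bands,
# 4 lidar-derived structural features per tree crown.
stage1 = RGBTaxonCNN(n_taxa=10)
stage2 = FusionNet(n_taxa=10, n_hsi=369, n_lidar=4)
probs = stage1(torch.randn(2, 3, 64, 64))
logits = stage2(probs, torch.randn(2, 369), torch.randn(2, 4))
print(logits.shape)  # torch.Size([2, 10])

One presumable benefit of this split is that stage 2 sees a low-dimensional probability vector rather than raw RGB pixels, so the fusion network stays small while the CNN handles feature extraction from the imagery.
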
format Online
Article
Text
id pubmed-8325917
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher PeerJ Inc.
record_format MEDLINE/PubMed
spelling pubmed-8325917 2021-08-13 Fusion neural networks for plant classification: learning to combine RGB, hyperspectral, and lidar data Scholl, Victoria M. McGlinchy, Joseph Price-Broncucia, Teo Balch, Jennifer K. Joseph, Maxwell B. PeerJ Ecology Airborne remote sensing offers unprecedented opportunities to efficiently monitor vegetation, but methods to delineate and classify individual plant species using the collected data are still actively being developed and improved. The Integrating Data science with Trees and Remote Sensing (IDTReeS) plant identification competition openly invited scientists to create and compare individual tree mapping methods. Participants were tasked with training taxon identification algorithms on data from two sites and then transferring their methods to a third, unseen site, using field-based plant observations in combination with airborne remote sensing image data products from the National Ecological Observatory Network (NEON). These data were captured by a high-resolution digital camera sensitive to red, green, and blue (RGB) light, a hyperspectral imaging spectrometer spanning visible to shortwave infrared wavelengths, and lidar systems, together capturing the spectral and structural properties of vegetation. As participants in the IDTReeS competition, we developed a two-stage deep learning approach to integrate NEON remote sensing data from all three sensors and classify individual plant species and genera. The first stage was a convolutional neural network that generates taxon probabilities from RGB images, and the second stage was a fusion neural network that “learns” how to combine these probabilities with hyperspectral and lidar data. Our two-stage approach leverages the ability of neural networks to flexibly and automatically extract descriptive features from complex, high-dimensional image data. Our method achieved an overall classification accuracy of 0.51 on the training set and 0.32 on the test set, which contained data from an unseen site with unknown taxon classes. Although transferring classification algorithms to unseen sites with unknown species and genus classes proved challenging, developing methods with openly available NEON data, which will be collected in a standardized format for 30 years, allows for continual improvement and major gains for the computational ecology community. We outline promising directions related to data preparation and processing techniques for further investigation, and provide our code to contribute to open, reproducible science efforts. PeerJ Inc. 2021-07-29 /pmc/articles/PMC8325917/ /pubmed/34395073 http://dx.doi.org/10.7717/peerj.11790 Text en © 2021 Scholl et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ) and either DOI or URL of the article must be cited.
spellingShingle Ecology
Scholl, Victoria M.
McGlinchy, Joseph
Price-Broncucia, Teo
Balch, Jennifer K.
Joseph, Maxwell B.
Fusion neural networks for plant classification: learning to combine RGB, hyperspectral, and lidar data
title Fusion neural networks for plant classification: learning to combine RGB, hyperspectral, and lidar data
title_full Fusion neural networks for plant classification: learning to combine RGB, hyperspectral, and lidar data
title_fullStr Fusion neural networks for plant classification: learning to combine RGB, hyperspectral, and lidar data
title_full_unstemmed Fusion neural networks for plant classification: learning to combine RGB, hyperspectral, and lidar data
title_short Fusion neural networks for plant classification: learning to combine RGB, hyperspectral, and lidar data
title_sort fusion neural networks for plant classification: learning to combine rgb, hyperspectral, and lidar data
topic Ecology
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8325917/
https://www.ncbi.nlm.nih.gov/pubmed/34395073
http://dx.doi.org/10.7717/peerj.11790
work_keys_str_mv AT schollvictoriam fusionneuralnetworksforplantclassificationlearningtocombinergbhyperspectralandlidardata
AT mcglinchyjoseph fusionneuralnetworksforplantclassificationlearningtocombinergbhyperspectralandlidardata
AT pricebroncuciateo fusionneuralnetworksforplantclassificationlearningtocombinergbhyperspectralandlidardata
AT balchjenniferk fusionneuralnetworksforplantclassificationlearningtocombinergbhyperspectralandlidardata
AT josephmaxwellb fusionneuralnetworksforplantclassificationlearningtocombinergbhyperspectralandlidardata