Multimodal Deep Learning and Visible-Light and Hyperspectral Imaging for Fruit Maturity Estimation

Bibliographic Details
Main Authors: Garillos-Manliguez, Cinmayii A., Chiang, John Y.
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7916978/
https://www.ncbi.nlm.nih.gov/pubmed/33670232
http://dx.doi.org/10.3390/s21041288
_version_ 1783657600916652032
author Garillos-Manliguez, Cinmayii A.
Chiang, John Y.
author_facet Garillos-Manliguez, Cinmayii A.
Chiang, John Y.
author_sort Garillos-Manliguez, Cinmayii A.
collection PubMed
description Fruit maturity is a critical factor in the supply chain, consumer preference, and the agriculture industry. Most fruit-maturity classification methods distinguish only two classes, ripe and unripe; this paper instead estimates six maturity stages of papaya fruit. Deep learning architectures have gained respect and brought breakthroughs in unimodal processing. This paper proposes a novel, non-destructive, multimodal classification approach using deep convolutional neural networks that estimates fruit maturity by concatenating features extracted from two imaging modes: visible-light and hyperspectral imaging systems. Morphological changes in the sample fruits can be easily measured from RGB images, while spectral signatures with high sensitivity and high correlation to the internal properties of fruits can be extracted from hyperspectral images in the 400–900 nm wavelength range; these are factors that must be considered when building a model. This study further modified seven architectures (AlexNet, VGG16, VGG19, ResNet50, ResNeXt50, MobileNet, and MobileNetV2) to utilize multimodal data cubes composed of RGB and hyperspectral data, and performed sensitivity analyses. These multimodal variants achieve F1 scores of up to 0.90 and a top-2 error rate of 1.45% in classifying the six stages. Overall, multimodal input coupled with powerful deep convolutional neural network models can classify fruit maturity even at the refined level of six stages. This indicates that multimodal deep learning architectures and multimodal imaging have great potential for real-time, in-field fruit maturity estimation, helping determine optimal harvest time and supporting other in-field industrial applications.
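The fusion strategy the abstract describes can be sketched in minimal form: each imaging mode is reduced to a flat feature vector, the vectors are concatenated, and a single classifier head maps the fused features to one of six maturity stages. This is a hypothetical illustration only, not the authors' actual networks; the function names, the global-average-pooling "branches", and the placeholder linear classifier are all assumptions introduced for the example.

```python
def pool_features(cube):
    """Global average pooling: one feature per channel/band.

    `cube` is a list of 2-D channel arrays (lists of rows of floats),
    e.g. 3 channels for an RGB image or many bands for a hyperspectral cube.
    """
    feats = []
    for channel in cube:
        vals = [v for row in channel for v in row]  # flatten the channel
        feats.append(sum(vals) / len(vals))         # mean over all pixels
    return feats


def fuse_and_classify(rgb_cube, hsi_cube, weights, biases):
    """Concatenate per-mode features, then apply a linear 6-way classifier.

    `weights` is a 6 x (n_rgb + n_hsi) matrix and `biases` a length-6 vector;
    in a real model both would be learned, here they are placeholders.
    Returns the index (0-5) of the predicted maturity stage.
    """
    # Feature-level fusion: simple concatenation of the two modes' vectors.
    fused = pool_features(rgb_cube) + pool_features(hsi_cube)
    scores = [
        sum(w * f for w, f in zip(row, fused)) + b
        for row, b in zip(weights, biases)
    ]
    return max(range(6), key=lambda k: scores[k])
```

In the published work, the pooling branches would be replaced by full convolutional backbones (e.g. the modified VGG or MobileNet variants) and the linear head by trained fully connected layers; the sketch only shows where the concatenation sits in the pipeline.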
format Online
Article
Text
id pubmed-7916978
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7916978 2021-03-01 Multimodal Deep Learning and Visible-Light and Hyperspectral Imaging for Fruit Maturity Estimation Garillos-Manliguez, Cinmayii A. Chiang, John Y. Sensors (Basel) Article
MDPI 2021-02-11 /pmc/articles/PMC7916978/ /pubmed/33670232 http://dx.doi.org/10.3390/s21041288 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Garillos-Manliguez, Cinmayii A.
Chiang, John Y.
Multimodal Deep Learning and Visible-Light and Hyperspectral Imaging for Fruit Maturity Estimation
title Multimodal Deep Learning and Visible-Light and Hyperspectral Imaging for Fruit Maturity Estimation
title_full Multimodal Deep Learning and Visible-Light and Hyperspectral Imaging for Fruit Maturity Estimation
title_fullStr Multimodal Deep Learning and Visible-Light and Hyperspectral Imaging for Fruit Maturity Estimation
title_full_unstemmed Multimodal Deep Learning and Visible-Light and Hyperspectral Imaging for Fruit Maturity Estimation
title_short Multimodal Deep Learning and Visible-Light and Hyperspectral Imaging for Fruit Maturity Estimation
title_sort multimodal deep learning and visible-light and hyperspectral imaging for fruit maturity estimation
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7916978/
https://www.ncbi.nlm.nih.gov/pubmed/33670232
http://dx.doi.org/10.3390/s21041288
work_keys_str_mv AT garillosmanliguezcinmayiia multimodaldeeplearningandvisiblelightandhyperspectralimagingforfruitmaturityestimation
AT chiangjohny multimodaldeeplearningandvisiblelightandhyperspectralimagingforfruitmaturityestimation