
Analysis of Depth Cameras for Proximal Sensing of Grapes

This work investigates the performance of five depth cameras in relation to their potential for grape yield estimation. The technologies used by these cameras include structured light (Kinect V1), active infrared stereoscopy (RealSense D415), time of flight (Kinect V2 and Kinect Azure), and LiDAR (Intel L515).


Bibliographic Details
Main Authors: Parr, Baden, Legg, Mathew, Alam, Fakhrul
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9185296/
https://www.ncbi.nlm.nih.gov/pubmed/35684799
http://dx.doi.org/10.3390/s22114179
author Parr, Baden
Legg, Mathew
Alam, Fakhrul
author_facet Parr, Baden
Legg, Mathew
Alam, Fakhrul
author_sort Parr, Baden
collection PubMed
description This work investigates the performance of five depth cameras in relation to their potential for grape yield estimation. The technologies used by these cameras include structured light (Kinect V1), active infrared stereoscopy (RealSense D415), time of flight (Kinect V2 and Kinect Azure), and LiDAR (Intel L515). To evaluate their suitability for grape yield estimation, a range of factors was investigated, including their performance in and out of direct sunlight, their ability to accurately measure the shape of the grapes, and their potential to facilitate counting and sizing of individual berries. The depth cameras’ performance was benchmarked using high-resolution photogrammetry scans. All the cameras except the Kinect V1 were able to operate in direct sunlight. Indoors, the RealSense D415 camera provided the most accurate depth scans of grape bunches, with a 2 mm average depth error relative to the photogrammetric scans. However, its performance was reduced in direct sunlight. The time of flight and LiDAR cameras provided depth scans of grapes that had about an 8 mm depth bias. Furthermore, the individual berries manifested in the scans as pointed shape distortions. This led to an underestimation of berry sizes when applying RANSAC sphere fitting, but may help with the detection of individual berries with more advanced algorithms. Applying an opaque coating to the surface of the grapes reduced the observed distance bias and shape distortion. This indicates that the bias and distortion are likely caused by the cameras’ transmitted light undergoing diffuse scattering within the grapes. More work is needed to investigate whether this distortion can be used for enhanced measurement of grape properties such as ripeness and berry size.
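The abstract reports that berry sizes were underestimated when RANSAC sphere fitting was applied to the distorted depth scans. The paper's own implementation is not reproduced here; the following is a minimal generic sketch of RANSAC sphere fitting on a 3D point cloud, where the sphere through a 4-point sample is recovered from the linear form |p|² + Dx + Ey + Fz + G = 0. All function names, iteration counts, and thresholds are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def fit_sphere(pts):
    """Fit a sphere to points via the linear form |p|^2 + Dx + Ey + Fz + G = 0."""
    A = np.hstack([pts, np.ones((len(pts), 1))])
    b = -np.sum(pts ** 2, axis=1)
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = -coef[:3] / 2.0
    radius = np.sqrt(center @ center - coef[3])
    return center, radius

def ransac_sphere(pts, n_iters=200, thresh=0.001, rng=None):
    """Illustrative RANSAC loop: sample 4 points, fit, keep the model
    with the most inliers (points within `thresh` of the sphere surface)."""
    rng = np.random.default_rng(rng)
    best_center, best_radius, best_inliers = None, 0.0, 0
    for _ in range(n_iters):
        sample = pts[rng.choice(len(pts), 4, replace=False)]
        center, radius = fit_sphere(sample)
        if not np.isfinite(radius):  # degenerate (e.g. near-coplanar) sample
            continue
        d = np.abs(np.linalg.norm(pts - center, axis=1) - radius)
        inliers = int((d < thresh).sum())
        if inliers > best_inliers:
            best_center, best_radius, best_inliers = center, radius, inliers
    return best_center, best_radius, best_inliers
```

On a clean berry-sized point cloud this recovers the true radius; with the pointed, stretched distortions the time-of-flight scans exhibited, the consensus set would favor a smaller sphere, which is consistent with the underestimation the authors report.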
format Online
Article
Text
id pubmed-9185296
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9185296 2022-06-11 Analysis of Depth Cameras for Proximal Sensing of Grapes Parr, Baden Legg, Mathew Alam, Fakhrul Sensors (Basel) Article
MDPI 2022-05-31 /pmc/articles/PMC9185296/ /pubmed/35684799 http://dx.doi.org/10.3390/s22114179 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Parr, Baden
Legg, Mathew
Alam, Fakhrul
Analysis of Depth Cameras for Proximal Sensing of Grapes
title Analysis of Depth Cameras for Proximal Sensing of Grapes
title_full Analysis of Depth Cameras for Proximal Sensing of Grapes
title_fullStr Analysis of Depth Cameras for Proximal Sensing of Grapes
title_full_unstemmed Analysis of Depth Cameras for Proximal Sensing of Grapes
title_short Analysis of Depth Cameras for Proximal Sensing of Grapes
title_sort analysis of depth cameras for proximal sensing of grapes
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9185296/
https://www.ncbi.nlm.nih.gov/pubmed/35684799
http://dx.doi.org/10.3390/s22114179
work_keys_str_mv AT parrbaden analysisofdepthcamerasforproximalsensingofgrapes
AT leggmathew analysisofdepthcamerasforproximalsensingofgrapes
AT alamfakhrul analysisofdepthcamerasforproximalsensingofgrapes