
Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera


Bibliographic Details
Main Authors: Rahkonen, Samuli, Lind, Leevi, Raita-Hakola, Anna-Maria, Kiiskinen, Sampsa, Pölönen, Ilkka
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9696373/
https://www.ncbi.nlm.nih.gov/pubmed/36433268
http://dx.doi.org/10.3390/s22228668
_version_ 1784838301322051584
author Rahkonen, Samuli
Lind, Leevi
Raita-Hakola, Anna-Maria
Kiiskinen, Sampsa
Pölönen, Ilkka
author_facet Rahkonen, Samuli
Lind, Leevi
Raita-Hakola, Anna-Maria
Kiiskinen, Sampsa
Pölönen, Ilkka
author_sort Rahkonen, Samuli
collection PubMed
description Hyperspectral imaging and distance data have previously been used in aerial, forestry, agricultural, and medical imaging applications. Extracting meaningful information from a combination of different imaging modalities is difficult: image sensor fusion requires knowing the optical properties of the sensors, selecting the right optics, and finding the sensors’ mutual reference frame through calibration. In this research we demonstrate a method for fusing data from a Fabry–Perot interferometer hyperspectral camera and a Kinect V2 time-of-flight depth-sensing camera. We created an experimental application that uses the depth-augmented hyperspectral data to measure emission-angle-dependent reflectance from a multi-view inferred point cloud. We determined the intrinsic and extrinsic camera parameters through calibration, used global and local registration algorithms to combine point clouds from different viewpoints, created a dense point cloud, and determined the angle-dependent reflectances from it. The method successfully combined the 3D point cloud data and hyperspectral data from different viewpoints of a reference colorchecker board. The point cloud registrations achieved [Formula: see text] – [Formula: see text] fitness for inlier point correspondences, and the RMSE was approximately 2, which indicates a fairly reliable registration result. The RMSE of the measured reflectances between the front view and side views of the targets varied between [Formula: see text] and [Formula: see text] on average, and the spectral angle varied between [Formula: see text] and [Formula: see text] degrees. The results suggest that a changing emission angle has a very small effect on the surface reflectance intensity and spectrum shape, which was expected with the colorchecker used.
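The abstract reports spectral angles between front-view and side-view reflectance spectra. As a minimal sketch (not the authors' code; the function name `spectral_angle_deg` is ours), the spectral angle between two spectra can be computed with NumPy:

```python
import numpy as np

def spectral_angle_deg(s1, s2):
    """Spectral angle, in degrees, between two reflectance spectra.

    A small angle means the spectrum *shapes* agree, independent of an
    overall intensity scaling -- which is why the metric suits comparing
    the same target viewed from different emission angles.
    """
    s1 = np.asarray(s1, dtype=float)
    s2 = np.asarray(s2, dtype=float)
    cos = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    # Clip to guard against floating-point values just outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Orthogonal spectra are maximally dissimilar in shape.
print(round(spectral_angle_deg([1.0, 0.0], [0.0, 1.0]), 6))  # → 90.0
```

A spectrum and a scaled copy of itself yield an angle near zero, so intensity changes alone do not register as shape differences.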
format Online
Article
Text
id pubmed-9696373
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9696373 2022-11-26 Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera Rahkonen, Samuli Lind, Leevi Raita-Hakola, Anna-Maria Kiiskinen, Sampsa Pölönen, Ilkka Sensors (Basel) Article MDPI 2022-11-10 /pmc/articles/PMC9696373/ /pubmed/36433268 http://dx.doi.org/10.3390/s22228668 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Rahkonen, Samuli
Lind, Leevi
Raita-Hakola, Anna-Maria
Kiiskinen, Sampsa
Pölönen, Ilkka
Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera
title Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera
title_full Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera
title_fullStr Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera
title_full_unstemmed Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera
title_short Reflectance Measurement Method Based on Sensor Fusion of Frame-Based Hyperspectral Imager and Time-of-Flight Depth Camera
title_sort reflectance measurement method based on sensor fusion of frame-based hyperspectral imager and time-of-flight depth camera
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9696373/
https://www.ncbi.nlm.nih.gov/pubmed/36433268
http://dx.doi.org/10.3390/s22228668
work_keys_str_mv AT rahkonensamuli reflectancemeasurementmethodbasedonsensorfusionofframebasedhyperspectralimagerandtimeofflightdepthcamera
AT lindleevi reflectancemeasurementmethodbasedonsensorfusionofframebasedhyperspectralimagerandtimeofflightdepthcamera
AT raitahakolaannamaria reflectancemeasurementmethodbasedonsensorfusionofframebasedhyperspectralimagerandtimeofflightdepthcamera
AT kiiskinensampsa reflectancemeasurementmethodbasedonsensorfusionofframebasedhyperspectralimagerandtimeofflightdepthcamera
AT polonenilkka reflectancemeasurementmethodbasedonsensorfusionofframebasedhyperspectralimagerandtimeofflightdepthcamera