Color Image Generation from Range and Reflection Data of LiDAR

Recently, it has been reported that a camera-captured-like color image can be generated from the reflection data of 3D light detection and ranging (LiDAR). In this paper, we show that a color image can also be generated from the range data of LiDAR. We propose deep learning networks that generate color images by fusing reflection and range data from LiDAR point clouds. In the proposed networks, the two datasets are fused in three ways: early, mid, and last fusion. The baseline network is the encoder-decoder structured fully convolutional network (ED-FCN). Image generation performance was evaluated according to source type: reflection data only, range data only, and fusion of the two. The well-known KITTI dataset was used for training and verification. Simulation results show that the proposed last fusion method yields improvements of 0.53 dB, 0.49 dB, and 0.02 in gray-scale peak signal-to-noise ratio (PSNR), color-scale PSNR, and structural similarity index measure (SSIM), respectively, over the conventional reflection-based ED-FCN. Moreover, the last fusion method is suitable for real-time applications, with an average processing time of 13.56 ms per frame. The methodology presented in this paper could be a powerful tool for generating data from two or more heterogeneous sources.

Bibliographic Details
Main Authors: Kim, Hyun-Koo, Yoo, Kook-Yeol, Jung, Ho-Youl
Format: Online Article Text
Language: English
Published: MDPI 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7570707/
https://www.ncbi.nlm.nih.gov/pubmed/32967317
http://dx.doi.org/10.3390/s20185414
_version_ 1783597009313202176
author Kim, Hyun-Koo
Yoo, Kook-Yeol
Jung, Ho-Youl
author_facet Kim, Hyun-Koo
Yoo, Kook-Yeol
Jung, Ho-Youl
author_sort Kim, Hyun-Koo
collection PubMed
description Recently, it has been reported that a camera-captured-like color image can be generated from the reflection data of 3D light detection and ranging (LiDAR). In this paper, we show that a color image can also be generated from the range data of LiDAR. We propose deep learning networks that generate color images by fusing reflection and range data from LiDAR point clouds. In the proposed networks, the two datasets are fused in three ways: early, mid, and last fusion. The baseline network is the encoder-decoder structured fully convolutional network (ED-FCN). Image generation performance was evaluated according to source type: reflection data only, range data only, and fusion of the two. The well-known KITTI dataset was used for training and verification. Simulation results show that the proposed last fusion method yields improvements of 0.53 dB, 0.49 dB, and 0.02 in gray-scale peak signal-to-noise ratio (PSNR), color-scale PSNR, and structural similarity index measure (SSIM), respectively, over the conventional reflection-based ED-FCN. Moreover, the last fusion method is suitable for real-time applications, with an average processing time of 13.56 ms per frame. The methodology presented in this paper could be a powerful tool for generating data from two or more heterogeneous sources.
format Online
Article
Text
id pubmed-7570707
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-75707072020-10-28 Color Image Generation from Range and Reflection Data of LiDAR Kim, Hyun-Koo Yoo, Kook-Yeol Jung, Ho-Youl Sensors (Basel) Article Recently, it has been reported that a camera-captured-like color image can be generated from the reflection data of 3D light detection and ranging (LiDAR). In this paper, we present that the color image can also be generated from the range data of LiDAR. We propose deep learning networks that generate color images by fusing reflection and range data from LiDAR point clouds. In the proposed networks, the two datasets are fused in three ways—early, mid, and last fusion techniques. The baseline network is the encoder-decoder structured fully convolution network (ED-FCN). The image generation performances were evaluated according to source types, including reflection data-only, range data-only, and fusion of the two datasets. The well-known KITTI evaluation data were used for training and verification. The simulation results showed that the proposed last fusion method yields improvements of 0.53 dB, 0.49 dB, and 0.02 in gray-scale peak signal-to-noise ratio (PSNR), color-scale PSNR, and structural similarity index measure (SSIM), respectively, over the conventional reflection-based ED-FCN. Besides, the last fusion method can be applied to real-time applications with an average processing time of 13.56 ms per frame. The methodology presented in this paper would be a powerful tool for generating data from two or more heterogeneous sources. MDPI 2020-09-21 /pmc/articles/PMC7570707/ /pubmed/32967317 http://dx.doi.org/10.3390/s20185414 Text en © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Kim, Hyun-Koo
Yoo, Kook-Yeol
Jung, Ho-Youl
Color Image Generation from Range and Reflection Data of LiDAR
title Color Image Generation from Range and Reflection Data of LiDAR
title_full Color Image Generation from Range and Reflection Data of LiDAR
title_fullStr Color Image Generation from Range and Reflection Data of LiDAR
title_full_unstemmed Color Image Generation from Range and Reflection Data of LiDAR
title_short Color Image Generation from Range and Reflection Data of LiDAR
title_sort color image generation from range and reflection data of lidar
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7570707/
https://www.ncbi.nlm.nih.gov/pubmed/32967317
http://dx.doi.org/10.3390/s20185414
work_keys_str_mv AT kimhyunkoo colorimagegenerationfromrangeandreflectiondataoflidar
AT yookookyeol colorimagegenerationfromrangeandreflectiondataoflidar
AT junghoyoul colorimagegenerationfromrangeandreflectiondataoflidar
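The PSNR figures quoted in the abstract can be reproduced with a short sketch. This is a minimal illustration of the metric, not the paper's evaluation pipeline: the images below are toy stand-ins for a camera-captured reference frame and a LiDAR-generated one, not KITTI data, and the `psnr` helper is a hypothetical name introduced here.

```python
import numpy as np

def psnr(ref, gen, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between a reference and a generated image."""
    mse = np.mean((ref.astype(np.float64) - gen.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy 8-bit color images standing in for ground truth and a generated frame.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
noise = rng.integers(-5, 6, size=ref.shape)
gen = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)

print(round(psnr(ref, gen), 2))  # color-scale PSNR in dB
```

Gray-scale PSNR is the same computation applied to luminance-converted images; SSIM is a separate windowed statistic not sketched here.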