Robust Fusion of LiDAR and Wide-Angle Camera Data for Autonomous Mobile Robots

Bibliographic Details
Main Authors: De Silva, Varuna; Roche, Jamie; Kondoz, Ahmet
Format: Online Article Text
Language: English
Published: MDPI 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6112019/
https://www.ncbi.nlm.nih.gov/pubmed/30127253
http://dx.doi.org/10.3390/s18082730
collection PubMed
description Autonomous robots that assist humans in day-to-day living tasks are becoming increasingly popular. Autonomous mobile robots operate by sensing and perceiving their surrounding environment to make accurate driving decisions. A combination of several different sensors, such as LiDAR, radar, ultrasound sensors, and cameras, is utilized to sense the surrounding environment of autonomous vehicles. These heterogeneous sensors simultaneously capture various physical attributes of the environment. Such multimodality and redundancy of sensing need to be positively utilized for reliable and consistent perception of the environment through sensor data fusion. However, these multimodal sensor data streams differ from each other in many ways, such as temporal and spatial resolution, data format, and geometric alignment. For the subsequent perception algorithms to utilize the diversity offered by multimodal sensing, the data streams need to be spatially, geometrically, and temporally aligned with each other. In this paper, we address the problem of fusing the outputs of a Light Detection and Ranging (LiDAR) scanner and a wide-angle monocular image sensor for free space detection. The outputs of the LiDAR scanner and the image sensor are of different spatial resolutions and need to be aligned with each other. A geometrical model is used to spatially align the two sensor outputs, followed by a Gaussian Process (GP) regression-based resolution matching algorithm to interpolate the missing data with quantifiable uncertainty. The results indicate that the proposed sensor data fusion framework significantly aids the subsequent perception steps, as illustrated by the performance improvement of an uncertainty-aware free space detection algorithm.
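The GP regression step described in the abstract — interpolating sparse range measurements while quantifying the uncertainty of each interpolated value — can be illustrated with a minimal, generic sketch. This is not the authors' implementation: the squared-exponential kernel, its hyperparameters, and the 1-D depth samples below are all hypothetical stand-ins for the paper's resolution-matching setup.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential kernel between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_interpolate(x_train, y_train, x_query, noise=1e-4,
                   length_scale=1.0, variance=1.0):
    """Standard GP regression: posterior mean and variance at x_query."""
    K = rbf_kernel(x_train, x_train, length_scale, variance)
    K += noise * np.eye(len(x_train))          # jitter for stability
    K_s = rbf_kernel(x_query, x_train, length_scale, variance)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha                          # posterior mean
    v = np.linalg.solve(K, K_s.T)
    var = variance - np.sum(K_s * v.T, axis=1)  # posterior variance (diag)
    return mean, var

# Hypothetical sparse LiDAR-like depth samples along one scanline
x = np.array([0.0, 1.0, 2.5, 4.0, 5.0])
d = np.array([3.0, 3.1, 2.8, 2.9, 3.0])
xq = np.linspace(0.0, 5.0, 11)                  # denser query grid
mean, var = gp_interpolate(x, d, xq)
```

The posterior variance is what makes the interpolation "uncertainty-aware": it is near zero at measured points and grows in the gaps between them, so a downstream free-space detector can weight or discard interpolated ranges accordingly.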
id pubmed-6112019
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
Published in: Sensors (Basel), Article
MDPI 2018-08-20 /pmc/articles/PMC6112019/ /pubmed/30127253 http://dx.doi.org/10.3390/s18082730 Text en © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).