
Unifying Obstacle Detection, Recognition, and Fusion Based on the Polarization Color Stereo Camera and LiDAR for the ADAS


Bibliographic Details
Main Authors: Long, Ningbo, Yan, Han, Wang, Liqiang, Li, Haifeng, Yang, Qing
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9003213/
https://www.ncbi.nlm.nih.gov/pubmed/35408068
http://dx.doi.org/10.3390/s22072453
author Long, Ningbo
Yan, Han
Wang, Liqiang
Li, Haifeng
Yang, Qing
collection PubMed
description The perception module plays an important role in vehicles equipped with advanced driver-assistance systems (ADAS). This paper presents a multi-sensor data fusion system based on a polarization color stereo camera and a forward-looking light detection and ranging (LiDAR) sensor, which achieves multi-target detection, recognition, and data fusion. The You Only Look Once v4 (YOLOv4) network is used for object detection and recognition on the color images. Depth images are computed from the rectified left and right images based on the epipolar constraint, and obstacles are then detected from the depth images using the MeanShift algorithm. Pixel-level polarization images are extracted from the raw polarization-grey images, from which water hazards are detected. The PointPillars network is employed to detect objects in the point cloud. Calibration and synchronization between the sensors are accomplished. The experimental results show that the data fusion enriches the detection results, provides high-dimensional perceptual information, and extends the effective detection range. Meanwhile, the detection results remain stable under diverse range and illumination conditions.
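As a rough illustration of the polarization step mentioned in the abstract, the sketch below computes the degree of linear polarization (DoLP) from a four-directional (0°, 45°, 90°, 135°) polarization-grey mosaic via the linear Stokes parameters and thresholds it to flag strongly polarized pixels as water-hazard candidates. The 2x2 mosaic layout, the DoLP formulation, and the threshold value are assumptions made for illustration; they are not details given in the abstract.

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a raw polarization-grey mosaic into its four angle channels.

    Assumes a 2x2 micro-polarizer pattern with 90/45 on even rows and
    135/0 on odd rows; the actual layout of the camera used in the paper
    may differ.
    """
    i90  = raw[0::2, 0::2].astype(np.float32)
    i45  = raw[0::2, 1::2].astype(np.float32)
    i135 = raw[1::2, 0::2].astype(np.float32)
    i0   = raw[1::2, 1::2].astype(np.float32)
    return i0, i45, i90, i135

def degree_of_linear_polarization(i0, i45, i90, i135, eps=1e-6):
    """Pixel-level DoLP from the linear Stokes parameters S0, S1, S2."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                       # 0 deg vs. 90 deg component
    s2 = i45 - i135                     # 45 deg vs. 135 deg component
    return np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)

def water_hazard_mask(dolp, threshold=0.4):
    """Flag strongly polarized pixels as water-hazard candidates.

    The threshold is purely illustrative, not a value from the paper.
    """
    return dolp > threshold

# Usage with a synthetic 8-bit mosaic frame.
raw = np.random.randint(0, 256, size=(1024, 1224), dtype=np.uint8)
i0, i45, i90, i135 = split_polarization_mosaic(raw)
mask = water_hazard_mask(degree_of_linear_polarization(i0, i45, i90, i135))
```

In the pipeline described by the abstract, such a polarization mask would be combined with the YOLOv4, MeanShift-on-depth, and PointPillars detections after sensor calibration and time synchronization.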
format Online
Article
Text
id pubmed-9003213
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9003213 2022-04-13 Unifying Obstacle Detection, Recognition, and Fusion Based on the Polarization Color Stereo Camera and LiDAR for the ADAS Long, Ningbo Yan, Han Wang, Liqiang Li, Haifeng Yang, Qing Sensors (Basel) Article MDPI 2022-03-23 /pmc/articles/PMC9003213/ /pubmed/35408068 http://dx.doi.org/10.3390/s22072453 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Unifying Obstacle Detection, Recognition, and Fusion Based on the Polarization Color Stereo Camera and LiDAR for the ADAS
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9003213/
https://www.ncbi.nlm.nih.gov/pubmed/35408068
http://dx.doi.org/10.3390/s22072453