Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles
In autonomous driving, using a variety of sensors to recognize preceding vehicles at middle and long distances helps improve driving performance and enables the development of various functions. However, if only LiDAR or cameras are used in the recognition stage, it is difficult to obtain the necessary data due to the limitations of each sensor.
Main authors: | Bae, Hyunjin; Lee, Gu; Yang, Jaeseung; Shin, Gwanjun; Choi, Gyeungho; Lim, Yongseob |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2021 |
Subjects: | Article |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8125378/ https://www.ncbi.nlm.nih.gov/pubmed/33946282 http://dx.doi.org/10.3390/s21093124 |
_version_ | 1783693484195053568 |
author | Bae, Hyunjin; Lee, Gu; Yang, Jaeseung; Shin, Gwanjun; Choi, Gyeungho; Lim, Yongseob |
author_facet | Bae, Hyunjin; Lee, Gu; Yang, Jaeseung; Shin, Gwanjun; Choi, Gyeungho; Lim, Yongseob |
author_sort | Bae, Hyunjin |
collection | PubMed |
description | In autonomous driving, using a variety of sensors to recognize preceding vehicles at middle and long distances helps improve driving performance and enables the development of various functions. However, if only LiDAR or cameras are used in the recognition stage, it is difficult to obtain the necessary data due to the limitations of each sensor. In this paper, we propose a method of converting vision-tracked data into bird’s-eye-view (BEV) coordinates using an equation that projects LiDAR points onto an image, together with a method of fusing LiDAR data with the vision-tracked data. The results of detecting the closest in-path vehicle (CIPV) in various situations show that the proposed method is effective. In addition, when the fusion result was evaluated under the Euro NCAP autonomous emergency braking (AEB) test protocol, AEB performance improved through better perception compared with using LiDAR alone. The performance of the proposed method was demonstrated through actual vehicle tests in various scenarios. Consequently, the proposed sensor fusion method significantly improved the adaptive cruise control (ACC) function in autonomous maneuvering. We expect this improvement in perception performance to contribute to the overall stability of ACC. |
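The abstract refers to an equation that projects LiDAR points onto the camera image, the building block behind converting vision-tracked data into BEV coordinates. The paper's actual calibration matrices are not given in this record; the sketch below illustrates the standard pinhole projection such a method typically relies on, with `K`, `R`, and `t` as purely illustrative placeholder values.

```python
import numpy as np

# Hypothetical calibration values for illustration only -- the paper's
# actual intrinsics/extrinsics are not reproduced in this record.
K = np.array([[800.0,   0.0, 640.0],   # camera intrinsic matrix
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                           # LiDAR-to-camera rotation
t = np.array([0.0, -0.5, 0.0])          # LiDAR-to-camera translation (m)

def lidar_to_image(points_lidar):
    """Project Nx3 LiDAR points (x, y, z) into pixel coordinates (u, v)."""
    cam = points_lidar @ R.T + t        # transform into the camera frame
    cam = cam[cam[:, 2] > 0]            # keep only points in front of the camera
    uvw = cam @ K.T                     # apply the pinhole model
    return uvw[:, :2] / uvw[:, 2:3]     # divide by depth to get pixels
```

Inverting this mapping for a point with known ground height is what allows an image-space track to be placed in BEV coordinates and associated with LiDAR clusters.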
format | Online Article Text |
id | pubmed-8125378 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-8125378 2021-05-17 Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles Bae, Hyunjin Lee, Gu Yang, Jaeseung Shin, Gwanjun Choi, Gyeungho Lim, Yongseob Sensors (Basel) Article In autonomous driving, using a variety of sensors to recognize preceding vehicles at middle and long distances helps improve driving performance and enables the development of various functions. However, if only LiDAR or cameras are used in the recognition stage, it is difficult to obtain the necessary data due to the limitations of each sensor. In this paper, we propose a method of converting vision-tracked data into bird’s-eye-view (BEV) coordinates using an equation that projects LiDAR points onto an image, together with a method of fusing LiDAR data with the vision-tracked data. The results of detecting the closest in-path vehicle (CIPV) in various situations show that the proposed method is effective. In addition, when the fusion result was evaluated under the Euro NCAP autonomous emergency braking (AEB) test protocol, AEB performance improved through better perception compared with using LiDAR alone. The performance of the proposed method was demonstrated through actual vehicle tests in various scenarios. Consequently, the proposed sensor fusion method significantly improved the adaptive cruise control (ACC) function in autonomous maneuvering. We expect this improvement in perception performance to contribute to the overall stability of ACC. MDPI 2021-04-30 /pmc/articles/PMC8125378/ /pubmed/33946282 http://dx.doi.org/10.3390/s21093124 Text en © 2021 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Bae, Hyunjin Lee, Gu Yang, Jaeseung Shin, Gwanjun Choi, Gyeungho Lim, Yongseob Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles |
title | Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles |
title_full | Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles |
title_fullStr | Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles |
title_full_unstemmed | Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles |
title_short | Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles |
title_sort | estimation of the closest in-path vehicle by low-channel lidar and camera sensor fusion for autonomous vehicles |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8125378/ https://www.ncbi.nlm.nih.gov/pubmed/33946282 http://dx.doi.org/10.3390/s21093124 |
work_keys_str_mv | AT baehyunjin estimationoftheclosestinpathvehiclebylowchannellidarandcamerasensorfusionforautonomousvehicles AT leegu estimationoftheclosestinpathvehiclebylowchannellidarandcamerasensorfusionforautonomousvehicles AT yangjaeseung estimationoftheclosestinpathvehiclebylowchannellidarandcamerasensorfusionforautonomousvehicles AT shingwanjun estimationoftheclosestinpathvehiclebylowchannellidarandcamerasensorfusionforautonomousvehicles AT choigyeungho estimationoftheclosestinpathvehiclebylowchannellidarandcamerasensorfusionforautonomousvehicles AT limyongseob estimationoftheclosestinpathvehiclebylowchannellidarandcamerasensorfusionforautonomousvehicles |