Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments
Landing an unmanned aerial vehicle (UAV) autonomously and safely is a challenging task. Although existing approaches have addressed precise landing by identifying a specific landing marker with the UAV’s onboard vision system, the vast majority of these works are conducted in either daytime or well-illuminated laboratory environments. In contrast, very few researchers have investigated landing in low-illumination conditions, and those who have rely on active light sources to illuminate the markers. In this paper, a novel vision system is proposed to tackle UAV landing in outdoor environments with extremely low illumination, without applying an active light source to the marker. We use a model-based enhancement scheme to improve the quality and brightness of the images captured onboard, then present a hierarchical method consisting of a decision tree with an associated lightweight convolutional neural network (CNN) for coarse-to-fine landing marker localization, where the key information of the marker is extracted and retained for post-processing such as pose estimation and landing control. Extensive evaluations demonstrate the robustness, accuracy, and real-time performance of the proposed vision system. Field experiments across a variety of outdoor nighttime scenarios, with an average illuminance of 5 lx at the marker locations, have proven the feasibility and practicality of the system.
Main authors: | Lin, Shanggang; Jin, Lianwen; Chen, Ziwei |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2021 |
Subjects: | Article |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8471562/ https://www.ncbi.nlm.nih.gov/pubmed/34577433 http://dx.doi.org/10.3390/s21186226 |
_version_ | 1784574498864889856 |
---|---|
author | Lin, Shanggang Jin, Lianwen Chen, Ziwei |
author_facet | Lin, Shanggang Jin, Lianwen Chen, Ziwei |
author_sort | Lin, Shanggang |
collection | PubMed |
description | Landing an unmanned aerial vehicle (UAV) autonomously and safely is a challenging task. Although existing approaches have addressed precise landing by identifying a specific landing marker with the UAV’s onboard vision system, the vast majority of these works are conducted in either daytime or well-illuminated laboratory environments. In contrast, very few researchers have investigated landing in low-illumination conditions, and those who have rely on active light sources to illuminate the markers. In this paper, a novel vision system is proposed to tackle UAV landing in outdoor environments with extremely low illumination, without applying an active light source to the marker. We use a model-based enhancement scheme to improve the quality and brightness of the images captured onboard, then present a hierarchical method consisting of a decision tree with an associated lightweight convolutional neural network (CNN) for coarse-to-fine landing marker localization, where the key information of the marker is extracted and retained for post-processing such as pose estimation and landing control. Extensive evaluations demonstrate the robustness, accuracy, and real-time performance of the proposed vision system. Field experiments across a variety of outdoor nighttime scenarios, with an average illuminance of 5 lx at the marker locations, have proven the feasibility and practicality of the system. |
format | Online Article Text |
id | pubmed-8471562 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-8471562 2021-09-28 Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments Lin, Shanggang Jin, Lianwen Chen, Ziwei Sensors (Basel) Article Landing an unmanned aerial vehicle (UAV) autonomously and safely is a challenging task. Although existing approaches have addressed precise landing by identifying a specific landing marker with the UAV’s onboard vision system, the vast majority of these works are conducted in either daytime or well-illuminated laboratory environments. In contrast, very few researchers have investigated landing in low-illumination conditions, and those who have rely on active light sources to illuminate the markers. In this paper, a novel vision system is proposed to tackle UAV landing in outdoor environments with extremely low illumination, without applying an active light source to the marker. We use a model-based enhancement scheme to improve the quality and brightness of the images captured onboard, then present a hierarchical method consisting of a decision tree with an associated lightweight convolutional neural network (CNN) for coarse-to-fine landing marker localization, where the key information of the marker is extracted and retained for post-processing such as pose estimation and landing control. Extensive evaluations demonstrate the robustness, accuracy, and real-time performance of the proposed vision system. Field experiments across a variety of outdoor nighttime scenarios, with an average illuminance of 5 lx at the marker locations, have proven the feasibility and practicality of the system. MDPI 2021-09-16 /pmc/articles/PMC8471562/ /pubmed/34577433 http://dx.doi.org/10.3390/s21186226 Text en © 2021 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Lin, Shanggang Jin, Lianwen Chen, Ziwei Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments |
title | Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments |
title_full | Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments |
title_fullStr | Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments |
title_full_unstemmed | Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments |
title_short | Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments |
title_sort | real-time monocular vision system for uav autonomous landing in outdoor low-illumination environments |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8471562/ https://www.ncbi.nlm.nih.gov/pubmed/34577433 http://dx.doi.org/10.3390/s21186226 |
work_keys_str_mv | AT linshanggang realtimemonocularvisionsystemforuavautonomouslandinginoutdoorlowilluminationenvironments AT jinlianwen realtimemonocularvisionsystemforuavautonomouslandinginoutdoorlowilluminationenvironments AT chenziwei realtimemonocularvisionsystemforuavautonomouslandinginoutdoorlowilluminationenvironments |
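
The coarse-to-fine pipeline described in the abstract (low-light enhancement, coarse candidate detection, then refinement with a lightweight CNN) can be illustrated with a minimal Python sketch. This is not the authors' implementation: the enhancement step is replaced by plain gamma correction, the CNN stage is a placeholder stub, and all function names, thresholds, and parameter values below are hypothetical.

```python
# Illustrative sketch only -- not the authors' implementation. All function
# names, thresholds, and the gamma value are hypothetical stand-ins.
import cv2
import numpy as np


def enhance_low_light(frame_bgr, gamma=0.4):
    """Brighten a dark frame with a gamma look-up table.

    A simple stand-in for the model-based enhancement scheme mentioned
    in the abstract.
    """
    lut = np.array([((i / 255.0) ** gamma) * 255 for i in range(256)],
                   dtype=np.uint8)
    return cv2.LUT(frame_bgr, lut)


def coarse_candidates(frame_bgr, min_area=100):
    """Coarse stage: threshold the enhanced frame and return bounding
    boxes of bright blobs as landing-marker candidates (OpenCV >= 4)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]


def classify_patch(patch):
    """Fine stage: in the paper this role is played by a lightweight CNN;
    here it is a placeholder that accepts every candidate."""
    return True


def localize_marker(frame_bgr):
    """Run the enhance -> coarse detect -> fine classify pipeline and
    return the first accepted marker box, or None."""
    enhanced = enhance_low_light(frame_bgr)
    for (x, y, w, h) in coarse_candidates(enhanced):
        patch = cv2.resize(enhanced[y:y + h, x:x + w], (64, 64))
        if classify_patch(patch):
            return (x, y, w, h)  # box handed to pose estimation / control
    return None
```

The point of the coarse-to-fine split, as the abstract suggests, is that the learned classifier only has to run on a handful of small candidate patches rather than the full frame, which is what makes real-time operation on an onboard computer plausible.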