Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints
This paper presents a practical yet effective solution for integrating an RGB-D camera and an inertial sensor to handle the depth dropouts that frequently happen in outdoor environments, due to the short detection range and sunlight interference. In depth drop conditions, only the partial 5-degrees-of-freedom pose information (attitude and position with an unknown scale) is available from the RGB-D sensor. To enable continuous fusion with the inertial solutions, the scale ambiguous position is cast into a directional constraint of the vehicle motion, which is, in essence, an epipolar constraint in multi-view geometry. Unlike other visual navigation approaches, this can effectively reduce the drift in the inertial solutions without delay or under small parallax motion. If a depth image is available, a window-based feature map is maintained to compute the RGB-D odometry, which is then fused with inertial outputs in an extended Kalman filter framework. Flight results from the indoor and outdoor environments, as well as public datasets, demonstrate the improved navigation performance of the proposed approach.
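The abstract's central idea, casting a scale-ambiguous visual position into a directional constraint on the vehicle motion, can be summarized compactly. The sketch below uses generic multi-view-geometry notation; the symbols $\mathbf{x}_1$, $\mathbf{x}_2$, $R$, $\hat{\mathbf{t}}$, and $\Delta\mathbf{p}$ are illustrative assumptions, not necessarily the paper's own notation:

$$
\mathbf{x}_2^{\top} E\,\mathbf{x}_1 = 0, \qquad E = [\hat{\mathbf{t}}]_{\times} R, \qquad \mathbf{z} \triangleq \hat{\mathbf{t}} \times \frac{\Delta\mathbf{p}}{\lVert \Delta\mathbf{p} \rVert} \approx \mathbf{0}.
$$

Here $\mathbf{x}_1$ and $\mathbf{x}_2$ are normalized image coordinates of a matched feature, $R$ and $\hat{\mathbf{t}}$ are the rotation and unit translation direction recovered from the RGB-D camera when depth drops out, and $\Delta\mathbf{p}$ is the inertially predicted displacement between the two camera poses, expressed in a common frame. The zero-valued pseudo-measurement $\mathbf{z}$ constrains only the direction of motion, so it can be fused in an extended Kalman filter even though the translation scale is unknown.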
Main Authors: | Qayyum, Usman; Kim, Jonghyuk |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2021 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8434182/ https://www.ncbi.nlm.nih.gov/pubmed/34502806 http://dx.doi.org/10.3390/s21175913 |
_version_ | 1783751537694081024 |
---|---|
author | Qayyum, Usman; Kim, Jonghyuk |
author_facet | Qayyum, Usman; Kim, Jonghyuk |
author_sort | Qayyum, Usman |
collection | PubMed |
description | This paper presents a practical yet effective solution for integrating an RGB-D camera and an inertial sensor to handle the depth dropouts that frequently happen in outdoor environments, due to the short detection range and sunlight interference. In depth drop conditions, only the partial 5-degrees-of-freedom pose information (attitude and position with an unknown scale) is available from the RGB-D sensor. To enable continuous fusion with the inertial solutions, the scale ambiguous position is cast into a directional constraint of the vehicle motion, which is, in essence, an epipolar constraint in multi-view geometry. Unlike other visual navigation approaches, this can effectively reduce the drift in the inertial solutions without delay or under small parallax motion. If a depth image is available, a window-based feature map is maintained to compute the RGB-D odometry, which is then fused with inertial outputs in an extended Kalman filter framework. Flight results from the indoor and outdoor environments, as well as public datasets, demonstrate the improved navigation performance of the proposed approach. |
format | Online Article Text |
id | pubmed-8434182 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-8434182 2021-09-12 Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints Qayyum, Usman Kim, Jonghyuk Sensors (Basel) Article This paper presents a practical yet effective solution for integrating an RGB-D camera and an inertial sensor to handle the depth dropouts that frequently happen in outdoor environments, due to the short detection range and sunlight interference. In depth drop conditions, only the partial 5-degrees-of-freedom pose information (attitude and position with an unknown scale) is available from the RGB-D sensor. To enable continuous fusion with the inertial solutions, the scale ambiguous position is cast into a directional constraint of the vehicle motion, which is, in essence, an epipolar constraint in multi-view geometry. Unlike other visual navigation approaches, this can effectively reduce the drift in the inertial solutions without delay or under small parallax motion. If a depth image is available, a window-based feature map is maintained to compute the RGB-D odometry, which is then fused with inertial outputs in an extended Kalman filter framework. Flight results from the indoor and outdoor environments, as well as public datasets, demonstrate the improved navigation performance of the proposed approach. MDPI 2021-09-02 /pmc/articles/PMC8434182/ /pubmed/34502806 http://dx.doi.org/10.3390/s21175913 Text en © 2021 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Qayyum, Usman Kim, Jonghyuk Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints |
title | Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints |
title_full | Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints |
title_fullStr | Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints |
title_full_unstemmed | Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints |
title_short | Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints |
title_sort | depth-camera-aided inertial navigation utilizing directional constraints |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8434182/ https://www.ncbi.nlm.nih.gov/pubmed/34502806 http://dx.doi.org/10.3390/s21175913 |
work_keys_str_mv | AT qayyumusman depthcameraaidedinertialnavigationutilizingdirectionalconstraints AT kimjonghyuk depthcameraaidedinertialnavigationutilizingdirectionalconstraints |