
Robust Tightly Coupled Pose Measurement Based on Multi-Sensor Fusion in Mobile Robot System


Bibliographic Details
Main Authors: Peng, Gang, Lu, Zezao, Peng, Jiaxi, He, Dingxin, Li, Xinde, Hu, Bin
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8402045/
https://www.ncbi.nlm.nih.gov/pubmed/34450964
http://dx.doi.org/10.3390/s21165522
_version_ 1783745694582964224
author Peng, Gang
Lu, Zezao
Peng, Jiaxi
He, Dingxin
Li, Xinde
Hu, Bin
author_facet Peng, Gang
Lu, Zezao
Peng, Jiaxi
He, Dingxin
Li, Xinde
Hu, Bin
author_sort Peng, Gang
collection PubMed
description Currently, simultaneous localization and mapping (SLAM) is one of the main research topics in the robotics field. Visual-inertial SLAM, which combines a camera and an inertial measurement unit (IMU), can significantly improve robustness and render the scale weakly observable, whereas in monocular visual SLAM the scale is unobservable. For ground mobile robots, introducing a wheel speed sensor can solve the weak scale observability problem and improve robustness under abnormal conditions. In this paper, a multi-sensor fusion SLAM algorithm using monocular visual, inertial, and wheel speed measurements is proposed. The sensor measurements are combined in a tightly coupled manner, and a nonlinear optimization method is used to maximize the posterior probability and solve for the optimal state estimate. Loop detection and back-end optimization are added to help reduce or even eliminate the cumulative error of the estimated poses, thus ensuring global consistency of the trajectory and map. The outstanding contribution of this paper is twofold: the wheel odometer pre-integration algorithm, which combines the chassis speed and the IMU angular velocity, avoids the repeated integration caused by linearization point changes during iterative optimization; and the state initialization based on the wheel odometer and IMU enables a quick and reliable calculation of the initial state values required by the state estimator in both stationary and moving states. Comparative experiments were conducted in room-scale scenes, building-scale scenes, and visual loss scenarios. The results showed that the proposed algorithm is highly accurate, with 2.2 m of cumulative error after moving 812 m (0.28%, loop closure optimization disabled), robust, and capable of effective localization even in the event of sensor loss, including visual loss. The accuracy and robustness of the proposed method are superior to those of monocular visual-inertial SLAM and traditional wheel odometry.
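To make the pre-integration idea in the description concrete, the following is a minimal sketch of wheel-odometer pre-integration that accumulates chassis linear velocity and IMU angular velocity between two keyframes into a single relative-motion increment, so the measurements do not have to be re-integrated when the optimizer changes its linearization point. The class and function names (WheelOdomPreintegrator, integrate, relative_motion) are illustrative assumptions rather than the authors' implementation; bias Jacobians, noise covariance propagation, and the surrounding factor-graph machinery are omitted.

import numpy as np

class WheelOdomPreintegrator:
    """Accumulates chassis velocity and IMU angular velocity between two
    keyframes into one relative-motion increment (hypothetical sketch)."""

    def __init__(self, gyro_bias=np.zeros(3)):
        self.delta_R = np.eye(3)      # pre-integrated rotation increment
        self.delta_p = np.zeros(3)    # pre-integrated position increment
        self.dt_total = 0.0
        self.gyro_bias = gyro_bias    # assumed-constant gyroscope bias

    @staticmethod
    def _exp_so3(phi):
        """Rodrigues' formula: rotation vector -> rotation matrix."""
        angle = np.linalg.norm(phi)
        if angle < 1e-12:
            return np.eye(3)
        a = phi / angle
        K = np.array([[0.0, -a[2], a[1]],
                      [a[2], 0.0, -a[0]],
                      [-a[1], a[0], 0.0]])
        return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

    def integrate(self, v_body, omega, dt):
        """v_body: chassis velocity in the body frame (m/s, 3-vector);
        omega: IMU angular velocity (rad/s, 3-vector); dt: sample period (s)."""
        # Accumulate translation using the current pre-integrated attitude,
        # then update the attitude with the bias-corrected gyro reading.
        self.delta_p += self.delta_R @ (v_body * dt)
        self.delta_R = self.delta_R @ self._exp_so3((omega - self.gyro_bias) * dt)
        self.dt_total += dt

    def relative_motion(self):
        """Relative rotation and translation between the two bracketing
        keyframes, expressed in the first keyframe's body frame."""
        return self.delta_R, self.delta_p


# Usage example: a robot driving at 0.5 m/s while turning at 0.2 rad/s for 2 s,
# sampled at 100 Hz (all values hypothetical).
if __name__ == "__main__":
    pre = WheelOdomPreintegrator()
    for _ in range(200):
        pre.integrate(np.array([0.5, 0.0, 0.0]),
                      np.array([0.0, 0.0, 0.2]), 0.01)
    dR, dp = pre.relative_motion()
    print("yaw increment (rad):", np.arctan2(dR[1, 0], dR[0, 0]))
    print("position increment (m):", dp)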
format Online
Article
Text
id pubmed-8402045
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8402045 2021-08-29 Robust Tightly Coupled Pose Measurement Based on Multi-Sensor Fusion in Mobile Robot System Peng, Gang Lu, Zezao Peng, Jiaxi He, Dingxin Li, Xinde Hu, Bin Sensors (Basel) Article Currently, simultaneous localization and mapping (SLAM) is one of the main research topics in the robotics field. Visual-inertial SLAM, which combines a camera and an inertial measurement unit (IMU), can significantly improve robustness and render the scale weakly observable, whereas in monocular visual SLAM the scale is unobservable. For ground mobile robots, introducing a wheel speed sensor can solve the weak scale observability problem and improve robustness under abnormal conditions. In this paper, a multi-sensor fusion SLAM algorithm using monocular visual, inertial, and wheel speed measurements is proposed. The sensor measurements are combined in a tightly coupled manner, and a nonlinear optimization method is used to maximize the posterior probability and solve for the optimal state estimate. Loop detection and back-end optimization are added to help reduce or even eliminate the cumulative error of the estimated poses, thus ensuring global consistency of the trajectory and map. The outstanding contribution of this paper is twofold: the wheel odometer pre-integration algorithm, which combines the chassis speed and the IMU angular velocity, avoids the repeated integration caused by linearization point changes during iterative optimization; and the state initialization based on the wheel odometer and IMU enables a quick and reliable calculation of the initial state values required by the state estimator in both stationary and moving states. Comparative experiments were conducted in room-scale scenes, building-scale scenes, and visual loss scenarios. The results showed that the proposed algorithm is highly accurate, with 2.2 m of cumulative error after moving 812 m (0.28%, loop closure optimization disabled), robust, and capable of effective localization even in the event of sensor loss, including visual loss. The accuracy and robustness of the proposed method are superior to those of monocular visual-inertial SLAM and traditional wheel odometry. MDPI 2021-08-17 /pmc/articles/PMC8402045/ /pubmed/34450964 http://dx.doi.org/10.3390/s21165522 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Peng, Gang
Lu, Zezao
Peng, Jiaxi
He, Dingxin
Li, Xinde
Hu, Bin
Robust Tightly Coupled Pose Measurement Based on Multi-Sensor Fusion in Mobile Robot System
title Robust Tightly Coupled Pose Measurement Based on Multi-Sensor Fusion in Mobile Robot System
title_full Robust Tightly Coupled Pose Measurement Based on Multi-Sensor Fusion in Mobile Robot System
title_fullStr Robust Tightly Coupled Pose Measurement Based on Multi-Sensor Fusion in Mobile Robot System
title_full_unstemmed Robust Tightly Coupled Pose Measurement Based on Multi-Sensor Fusion in Mobile Robot System
title_short Robust Tightly Coupled Pose Measurement Based on Multi-Sensor Fusion in Mobile Robot System
title_sort robust tightly coupled pose measurement based on multi-sensor fusion in mobile robot system
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8402045/
https://www.ncbi.nlm.nih.gov/pubmed/34450964
http://dx.doi.org/10.3390/s21165522
work_keys_str_mv AT penggang robusttightlycoupledposemeasurementbasedonmultisensorfusioninmobilerobotsystem
AT luzezao robusttightlycoupledposemeasurementbasedonmultisensorfusioninmobilerobotsystem
AT pengjiaxi robusttightlycoupledposemeasurementbasedonmultisensorfusioninmobilerobotsystem
AT hedingxin robusttightlycoupledposemeasurementbasedonmultisensorfusioninmobilerobotsystem
AT lixinde robusttightlycoupledposemeasurementbasedonmultisensorfusioninmobilerobotsystem
AT hubin robusttightlycoupledposemeasurementbasedonmultisensorfusioninmobilerobotsystem