
An Enhanced Hybrid Visual–Inertial Odometry System for Indoor Mobile Robot

As mobile robots come into widespread use, accurate localization becomes critical to the whole system. Compared with positioning systems based on a single sensor, multi-sensor fusion systems offer better accuracy and robustness. At present, camera–IMU (Inertial Measurement Unit) fusion positioning is extensively studied, and many representative Visual–Inertial Odometry (VIO) systems have been produced. The Multi-State Constraint Kalman Filter (MSCKF), one of the tightly coupled filtering methods, is characterized by high accuracy and low computational load among typical VIO methods. In the general framework, however, IMU information is not used after state prediction and covariance propagation. In this article, we propose a framework that introduces the IMU pre-integration result into the MSCKF framework as observation information to improve positioning accuracy. Additionally, the system uses the Helmert variance component estimation (HVCE) method to adjust the relative weight of feature points and pre-integration, further improving positioning accuracy. The system also uses the wheel-odometer information of the mobile robot to perform zero-velocity detection, zero-velocity updates, and pre-integration updates, enhancing positioning accuracy. Finally, experiments in a Gazebo simulation environment, on a public dataset, and in real scenarios show that the proposed algorithm achieves better accuracy than existing mainstream algorithms while maintaining real-time performance.
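The zero-velocity detection and update mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the threshold, window, and scalar-per-axis Kalman form below are illustrative assumptions, showing only the general idea of flagging a stationary interval from wheel speeds and fusing a pseudo-measurement of zero velocity.

```python
import numpy as np

def is_stationary(wheel_speeds, threshold=1e-3):
    """Flag a zero-velocity interval when every wheel-speed sample in the
    window stays below a small threshold (illustrative value, in rad/s)."""
    return bool(np.all(np.abs(wheel_speeds) < threshold))

def zero_velocity_update(velocity_estimate, velocity_cov, meas_noise=1e-4):
    """Kalman-style zero-velocity update: fuse the pseudo-measurement v = 0
    with the current velocity estimate (scalar-per-axis form for clarity)."""
    gain = velocity_cov / (velocity_cov + meas_noise)            # Kalman gain
    new_velocity = velocity_estimate - gain * velocity_estimate  # innovation = 0 - v
    new_cov = (1.0 - gain) * velocity_cov                        # reduced uncertainty
    return new_velocity, new_cov

# Example: a stationary wheel-speed window drives the velocity estimate
# toward zero and shrinks its covariance.
window = np.array([0.0002, -0.0001, 0.0003])
if is_stationary(window):
    v, P = zero_velocity_update(np.array([0.05, -0.02, 0.01]),
                                np.array([0.01, 0.01, 0.01]))
```

In a filter such as MSCKF, an update of this kind constrains velocity drift during the intervals when the robot is known to be standing still.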

Bibliographic Details
Main Authors: Liu, Yanjie; Zhao, Changsen; Ren, Meixuan
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9024916/
https://www.ncbi.nlm.nih.gov/pubmed/35458915
http://dx.doi.org/10.3390/s22082930
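The HVCE re-weighting between the two observation groups (visual feature points and IMU pre-integration) described in the abstract can be sketched as an iterative rescaling of group weights from their residuals. This is a simplified illustration, not the paper's formulation: the fixed redundancy numbers and the common-unit-variance rescaling are assumptions made for the example.

```python
import numpy as np

def hvce_weights(res_a, res_b, red_a, red_b, w_a=1.0, w_b=1.0, iters=5):
    """Helmert-style variance component estimation, simplified: estimate a
    variance component per observation group from its weighted residual sum
    and redundancy, then rescale weights toward a common unit variance."""
    for _ in range(iters):
        var_a = w_a * np.sum(res_a**2) / red_a   # variance component, group a
        var_b = w_b * np.sum(res_b**2) / red_b   # variance component, group b
        unit = 0.5 * (var_a + var_b)             # shared unit variance
        w_a, w_b = w_a * unit / var_a, w_b * unit / var_b
    return w_a, w_b

# Example: the group with larger residuals (noisier observations)
# ends up with the smaller weight.
res_noisy = np.full(100, 1.0)    # e.g. feature-point residuals
res_clean = np.full(100, 0.1)    # e.g. pre-integration residuals
wa, wb = hvce_weights(res_noisy, res_clean, red_a=100, red_b=100)
```

In the fused estimator this kind of adaptive weighting lets the measurement class that currently fits better dominate the update.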
collection PubMed
id pubmed-9024916
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
Journal: Sensors (Basel), Article
Published online: 2022-04-11 (/pmc/articles/PMC9024916/, /pubmed/35458915, http://dx.doi.org/10.3390/s22082930)
License: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
topic Article