Visual Features Assisted Robot Localization in Symmetrical Environment Using Laser SLAM
Main Authors: | Ge, Gengyu; Zhang, Yi; Jiang, Qin; Wang, Wei |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2021 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7961336/ https://www.ncbi.nlm.nih.gov/pubmed/33806414 http://dx.doi.org/10.3390/s21051772 |
_version_ | 1783665237080145920 |
---|---|
author | Ge, Gengyu Zhang, Yi Jiang, Qin Wang, Wei |
author_facet | Ge, Gengyu Zhang, Yi Jiang, Qin Wang, Wei |
author_sort | Ge, Gengyu |
collection | PubMed |
description | Localization for estimating the position and orientation of a robot in an asymmetrical environment has been solved by using various 2D laser rangefinder simultaneous localization and mapping (SLAM) approaches. Laser-based SLAM generates an occupancy grid map, then the most popular Monte Carlo Localization (MCL) method spreads particles on the map and calculates the position of the robot by a probabilistic algorithm. However, this can be difficult, especially in symmetrical environments, because landmarks or features may not be sufficient to determine the robot’s orientation. Sometimes the position is not unique if a robot does not stay at the geometric center. This paper presents a novel approach to solving the robot localization problem in a symmetrical environment using the visual features-assisted method. Laser range measurements are used to estimate the robot position, while visual features determine its orientation. Firstly, we convert laser range scans raw data into coordinate data and calculate the geometric center. Secondly, we calculate the new distance from the geometric center point to all end points and find the longest distances. Then, we compare those distances, fit lines, extract corner points, and calculate the distance between adjacent corner points to determine whether the environment is symmetrical. Finally, if the environment is symmetrical, visual features based on the ORB keypoint detector and descriptor will be added to the system to determine the orientation of the robot. The experimental results show that our approach can successfully determine the position of the robot in a symmetrical environment, while ordinary MCL and its extension localization method always fail. |
format | Online Article Text |
id | pubmed-7961336 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-7961336 2021-03-17 Visual Features Assisted Robot Localization in Symmetrical Environment Using Laser SLAM Ge, Gengyu Zhang, Yi Jiang, Qin Wang, Wei Sensors (Basel) Article MDPI 2021-03-04 /pmc/articles/PMC7961336/ /pubmed/33806414 http://dx.doi.org/10.3390/s21051772 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Ge, Gengyu Zhang, Yi Jiang, Qin Wang, Wei Visual Features Assisted Robot Localization in Symmetrical Environment Using Laser SLAM |
title | Visual Features Assisted Robot Localization in Symmetrical Environment Using Laser SLAM |
title_full | Visual Features Assisted Robot Localization in Symmetrical Environment Using Laser SLAM |
title_fullStr | Visual Features Assisted Robot Localization in Symmetrical Environment Using Laser SLAM |
title_full_unstemmed | Visual Features Assisted Robot Localization in Symmetrical Environment Using Laser SLAM |
title_short | Visual Features Assisted Robot Localization in Symmetrical Environment Using Laser SLAM |
title_sort | visual features assisted robot localization in symmetrical environment using laser slam |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7961336/ https://www.ncbi.nlm.nih.gov/pubmed/33806414 http://dx.doi.org/10.3390/s21051772 |
work_keys_str_mv | AT gegengyu visualfeaturesassistedrobotlocalizationinsymmetricalenvironmentusinglaserslam AT zhangyi visualfeaturesassistedrobotlocalizationinsymmetricalenvironmentusinglaserslam AT jiangqin visualfeaturesassistedrobotlocalizationinsymmetricalenvironmentusinglaserslam AT wangwei visualfeaturesassistedrobotlocalizationinsymmetricalenvironmentusinglaserslam |
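
The description above outlines a four-step pipeline: convert the 2D laser scan into Cartesian points, find the geometric center, use center-to-endpoint distances and the spacing of extracted corners to decide whether the environment is symmetrical, and, only when it is, bring in ORB visual features to disambiguate the robot's heading. The Python/NumPy sketch below illustrates the scan-processing half of that pipeline under stated assumptions. It is not the authors' implementation: corner extraction here uses local maxima of the distance to the center as a stand-in for the line fitting and intersection described in the abstract, and the function names and the 0.05 m tolerance are illustrative choices.

```python
import numpy as np

def scan_to_points(ranges, angle_min, angle_increment, range_max):
    """Convert raw 2D laser ranges into Cartesian (x, y) endpoints in the sensor frame."""
    ranges = np.asarray(ranges, dtype=float)
    angles = angle_min + angle_increment * np.arange(len(ranges))
    valid = np.isfinite(ranges) & (ranges > 0.0) & (ranges < range_max)
    return np.column_stack((ranges[valid] * np.cos(angles[valid]),
                            ranges[valid] * np.sin(angles[valid])))

def is_symmetric(points, tol=0.05):
    """Rough symmetry test on one scan (assumes a full 360-degree sweep).

    Corners are approximated as endpoints that are locally farthest from the
    geometric center (the paper fits lines and intersects them instead). The
    room is called symmetrical when the distances between adjacent corners
    pair up within `tol` meters, e.g. opposite sides of a rectangle.
    """
    center = points.mean(axis=0)                      # geometric center of the scan
    dists = np.linalg.norm(points - center, axis=1)   # center-to-endpoint distances
    peaks = (dists > np.roll(dists, 1)) & (dists > np.roll(dists, -1))
    corners = points[peaks]                           # kept in angular (scan) order
    if len(corners) < 4 or len(corners) % 2 != 0:
        return False
    closed = np.vstack([corners, corners[:1]])        # wrap around to close the polygon
    sides = np.linalg.norm(np.diff(closed, axis=0), axis=1)
    paired = np.sort(sides).reshape(-1, 2)            # matching side lengths land in pairs
    return bool(np.all(np.abs(paired[:, 0] - paired[:, 1]) < tol))
```

If the scans come from a ROS `sensor_msgs/LaserScan` message, the `ranges`, `angle_min`, `angle_increment`, and `range_max` arguments map directly onto the fields of the same names.

When such a check reports a symmetrical environment, the abstract states that ORB keypoints and descriptors are added to determine the orientation. A minimal way to realize that with OpenCV is to match the live camera image against a small set of reference views, one per candidate heading, and keep the heading whose view matches best; the `reference_views` dictionary and the Hamming-distance cutoff of 40 below are assumptions for illustration, not details taken from the paper.

```python
import cv2

def resolve_heading(live_gray, reference_views):
    """Return the candidate heading whose stored reference view best matches
    the current grayscale camera image, using ORB features.

    `reference_views` is a hypothetical dict {heading_in_degrees: grayscale image}
    captured once per symmetric pose hypothesis during mapping.
    """
    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    _, live_desc = orb.detectAndCompute(live_gray, None)
    if live_desc is None:
        return None                                    # no texture in view, cannot decide
    best_heading, best_score = None, 0
    for heading, ref_gray in reference_views.items():
        _, ref_desc = orb.detectAndCompute(ref_gray, None)
        if ref_desc is None:
            continue
        matches = matcher.match(live_desc, ref_desc)
        score = sum(1 for m in matches if m.distance < 40)   # count only strong matches
        if score > best_score:
            best_heading, best_score = heading, score
    return best_heading
```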