
Robust Stereo Visual Odometry Using Improved RANSAC-Based Methods for Mobile Robot Localization

In this paper, we present a novel approach for stereo visual odometry with robust motion estimation that is faster and more accurate than standard RANSAC (Random Sample Consensus). Our method improves RANSAC in three aspects: first, hypotheses are preferentially generated by sampling the input feature points in order of the ages and similarities of the features; second, hypotheses are evaluated with the SPRT (Sequential Probability Ratio Test), which discards bad hypotheses quickly without verifying all the data points; third, we aggregate the three best hypotheses to obtain the final estimate instead of selecting only the single best hypothesis. The first two aspects improve the speed of RANSAC by generating good hypotheses and discarding bad hypotheses early, respectively. The last aspect improves the accuracy of motion estimation. Our method was evaluated on the KITTI (Karlsruhe Institute of Technology and Toyota Technological Institute) and New Tsukuba datasets. Experimental results show that the proposed method achieves better speed and accuracy than RANSAC.
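The SPRT evaluation step mentioned in the abstract can be sketched generically. The snippet below follows the standard randomized-RANSAC formulation of Wald's sequential test (Matas and Chum); the parameter names `epsilon`, `delta`, and `A` and their default values are illustrative assumptions, not values taken from this paper:

```python
def sprt_verify(errors, threshold, epsilon=0.5, delta=0.05, A=50.0):
    """Wald's Sequential Probability Ratio Test applied to RANSAC
    hypothesis verification: scan residuals one by one and bail out
    as soon as the accumulated evidence says the hypothesis is bad.

    errors    -- per-point residuals under the hypothesis being tested
    threshold -- residual cutoff separating inliers from outliers
    epsilon   -- assumed inlier ratio for a *good* hypothesis
    delta     -- inlier ratio expected under a *bad* (random) hypothesis
    A         -- decision threshold; larger values reject more cautiously
    Returns (accepted, inlier_count, points_examined).
    """
    lam = 1.0
    inliers = 0
    for i, e in enumerate(errors, start=1):
        if e < threshold:
            inliers += 1
            lam *= delta / epsilon                   # inliers favour "good"
        else:
            lam *= (1.0 - delta) / (1.0 - epsilon)   # outliers favour "bad"
        if lam > A:                                  # likely a bad hypothesis:
            return False, inliers, i                 # stop without checking the rest
    return True, inliers, len(errors)
```

A bad hypothesis over mostly-outlier data is typically rejected after only a handful of points rather than all of them, which is the source of the speed-up the abstract attributes to SPRT-based evaluation.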


Bibliographic Details
Main Authors: Liu, Yanqing; Gu, Yuzhang; Li, Jiamao; Zhang, Xiaolin
Format: Online Article (Text)
Language: English
Published: MDPI, 2017
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5677260/
https://www.ncbi.nlm.nih.gov/pubmed/29027935
http://dx.doi.org/10.3390/s17102339
Journal: Sensors (Basel)
Published online: 2017-10-13
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).