A Robust and Integrated Visual Odometry Framework Exploiting the Optical Flow and Feature Point Method

Bibliographic Details
Main Authors: Qiu, Haiyang, Zhang, Xu, Wang, Hui, Xiang, Dan, Xiao, Mingming, Zhu, Zhiyu, Wang, Lei
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10611077/
https://www.ncbi.nlm.nih.gov/pubmed/37896748
http://dx.doi.org/10.3390/s23208655
author Qiu, Haiyang
Zhang, Xu
Wang, Hui
Xiang, Dan
Xiao, Mingming
Zhu, Zhiyu
Wang, Lei
collection PubMed
description In this paper, we propose a robust and integrated visual odometry framework that exploits both the optical flow and the feature point method, achieving faster pose estimation with considerable accuracy and robustness during the odometry process. Our method uses optical flow tracking to accelerate the feature point matching process. The odometry combines two visual odometry methods: a global feature point method and a local feature point method. When optical flow tracking is good and enough key points are matched successfully by the flow, the local feature point method uses the prior information from the optical flow to estimate the relative pose transformation. When optical flow tracking is poor and only a small number of key points match successfully, the global feature point method with a filtering mechanism is used for pose estimation. By coupling and correlating these two methods, the visual odometry greatly accelerates relative pose estimation, reducing its computation time to 40% of that of the ORB_SLAM3 front-end odometry while remaining comparable to the ORB_SLAM3 front end in accuracy and robustness. The effectiveness of this method was validated and analyzed on the EuRoC dataset within the ORB_SLAM3 open-source framework, and the experimental results support the efficacy of the proposed approach.
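The abstract describes a hybrid front end: cheap optical flow tracking supplies feature correspondences when it succeeds, and full descriptor matching with a filtering step takes over when it does not. The following is a minimal sketch of that idea in Python/OpenCV; the MIN_TRACKED threshold, the ORB/BFMatcher fallback, and the RANSAC essential-matrix pose step are illustrative assumptions, not the authors' actual ORB_SLAM3 integration.

```python
import cv2
import numpy as np

MIN_TRACKED = 100  # hypothetical threshold for "enough" tracked key points

orb = cv2.ORB_create(nfeatures=1000)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def match_points(prev_gray, curr_gray, prev_pts):
    """Return point correspondences between two grayscale frames.

    Fast path: Lucas-Kanade optical flow tracking of existing key points.
    Fallback: full ORB detection + brute-force matching when tracking is poor.
    """
    if prev_pts is not None and len(prev_pts) > 0:
        curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, curr_gray, prev_pts, None,
            winSize=(21, 21), maxLevel=3)
        good = status.ravel() == 1
        if good.sum() >= MIN_TRACKED:
            # Good tracking: flow correspondences feed the (fast) local path.
            return prev_pts[good], curr_pts[good]
    # Poor tracking: global feature point path, filtering matches by distance.
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matches = sorted(bf.match(des1, des2), key=lambda m: m.distance)[:500]
    p1 = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    p2 = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    return p1, p2

def relative_pose(p1, p2, K):
    """Estimate relative rotation R and unit-scale translation t via a
    RANSAC essential matrix -- a standard monocular VO step."""
    E, inliers = cv2.findEssentialMat(p1, p2, K, cv2.RANSAC, 0.999, 1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K, mask=inliers)
    return R, t
```

The speedup the abstract reports comes from this asymmetry: calcOpticalFlowPyrLK avoids recomputing and matching descriptors on every frame, so the expensive global path runs only when tracking degrades.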
format Online
Article
Text
id pubmed-10611077
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10611077 2023-10-28. Journal: Sensors (Basel), Article. Published by MDPI 2023-10-23. /pmc/articles/PMC10611077/ /pubmed/37896748 http://dx.doi.org/10.3390/s23208655 Text en. © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title A Robust and Integrated Visual Odometry Framework Exploiting the Optical Flow and Feature Point Method
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10611077/
https://www.ncbi.nlm.nih.gov/pubmed/37896748
http://dx.doi.org/10.3390/s23208655