Adaptive Absolute Ego-Motion Estimation Using Wearable Visual-Inertial Sensors for Indoor Positioning

This paper proposes an adaptive absolute ego-motion estimation method using wearable visual-inertial sensors for indoor positioning. We introduce a wearable visual-inertial device to estimate not only the camera's ego-motion but also the 3D motion of moving objects in dynamic environments. First, a novel dynamic scene segmentation method is proposed that uses two visual geometry constraints with the help of inertial sensors. Moreover, this paper introduces the concept of a "virtual camera", which treats the motion area associated with each moving object as if it were a static object viewed by that virtual camera. The 3D motion of a moving object can therefore be derived from the motions of the real and virtual cameras, because the virtual camera's motion is the combined motion of the real camera and the moving object. In addition, a multi-rate linear Kalman filter (MR-LKF) from our previous work is employed to handle both the scale ambiguity of monocular camera tracking and the different sampling frequencies of the visual and inertial sensors. The performance of the proposed method is evaluated in simulation studies and in practical experiments performed in both static and dynamic environments. The results demonstrate the method's robustness and effectiveness against ground truth provided by a Pioneer robot.
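
The abstract does not name the two visual geometry constraints used for dynamic scene segmentation, so the following is only a standard illustration of an inertial-aided geometry test, the epipolar constraint: with the inter-frame rotation $R$ taken from the gyroscope and a translation direction $t$, a feature pair $(\mathbf{x}, \mathbf{x}')$ in normalized image coordinates that belongs to the static scene should satisfy

$\mathbf{x}'^{\top} E \,\mathbf{x} = 0, \qquad E = [\,t\,]_{\times} R,$

so features whose residual $|\mathbf{x}'^{\top} E \,\mathbf{x}|$ stays large can be flagged as lying on moving objects. This is an illustrative test, not necessarily one of the two constraints used in the paper.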

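As a worked sketch of the "virtual camera" relation (our notation and frame convention, not taken from the paper): write rigid-body motions as $4 \times 4$ homogeneous transforms in one fixed world frame, with $T_c$ the real camera's ego-motion, $T_o$ the moving object's motion, and $T_v$ the virtual camera's motion. Since the virtual camera views the moving object as if it were static, its motion is the real camera's motion expressed relative to the moving object,

$T_v = T_o^{-1} T_c \quad \Longrightarrow \quad T_o = T_c T_v^{-1},$

so once the real and virtual camera motions are estimated, the object's 3D motion follows from a single inverse composition.
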
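A minimal Python sketch of the multi-rate fusion idea behind the MR-LKF stage, assuming a 1-D constant-velocity model, a 100 Hz IMU, and a 10 Hz camera (the state model, rates, and noise values are illustrative assumptions, and the monocular scale recovery handled by the paper's filter is omitted here):

import numpy as np

IMU_DT = 0.01            # 100 Hz inertial sampling period (assumed)
CAM_EVERY = 10           # one camera fix per 10 IMU samples, i.e., 10 Hz

F = np.array([[1.0, IMU_DT],
              [0.0, 1.0]])        # constant-velocity state transition
B = np.array([[0.5 * IMU_DT ** 2],
              [IMU_DT]])          # acceleration enters as a control input
H = np.array([[1.0, 0.0]])        # the camera observes position only
Q = 1e-4 * np.eye(2)              # process noise covariance (assumed)
R = np.array([[1e-2]])            # camera measurement noise (assumed)

x = np.zeros((2, 1))              # state estimate [position; velocity]
P = np.eye(2)                     # state covariance

def predict(accel):
    """Propagate the state with one IMU acceleration sample (high rate)."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def update(cam_pos):
    """Correct the state with one camera position fix (low rate)."""
    global x, P
    y = np.array([[cam_pos]]) - H @ x      # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

# Fuse a synthetic constant-acceleration stream: predict on every IMU
# tick, correct only on the ticks where a camera frame arrives.
rng = np.random.default_rng(0)
for k in range(200):
    predict(1.0 + 0.05 * rng.standard_normal())
    if (k + 1) % CAM_EVERY == 0:
        true_pos = 0.5 * ((k + 1) * IMU_DT) ** 2   # p = a*t^2/2 with a = 1
        update(true_pos + 0.01 * rng.standard_normal())

print(f"position = {x[0, 0]:.3f} m, velocity = {x[1, 0]:.3f} m/s")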

Bibliographic Details
Main Authors: Tian, Ya; Chen, Zhe; Lu, Shouyin; Tan, Jindong
Format: Online Article Text
Language: English
Published: MDPI, 6 March 2018
Journal: Micromachines (Basel)
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6187464/
https://www.ncbi.nlm.nih.gov/pubmed/30424047
http://dx.doi.org/10.3390/mi9030113
Collection: PubMed
Record ID: pubmed-6187464
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
License: © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).