
An Inertial and Optical Sensor Fusion Approach for Six Degree-of-Freedom Pose Estimation

Optical tracking provides relatively high accuracy over a large workspace but requires line-of-sight between the camera and the markers, which may be difficult to maintain in actual applications. In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. To handle cases where some or all of the markers are occluded, this paper proposes an inertial and optical sensor fusion approach in which the bias of the inertial sensors is estimated when the optical tracker provides full six degree-of-freedom (6-DOF) pose information. As long as the position of at least one marker can be tracked by the optical system, the 3-DOF position can be combined with the orientation estimated from the inertial measurements to recover the full 6-DOF pose information. When all the markers are occluded, the position tracking relies on the inertial sensors that are bias-corrected by the optical tracking system. Experiments are performed with an augmented reality head-mounted display (ARHMD) that integrates an optical tracking system (OTS) and inertial measurement unit (IMU). Experimental results show that under partial occlusion conditions, the root mean square errors (RMSE) of orientation and position are 0.04° and 0.134 mm, and under total occlusion conditions for 1 s, the orientation and position RMSE are 0.022° and 0.22 mm, respectively. Thus, the proposed sensor fusion approach can provide reliable 6-DOF pose under long-term partial occlusion and short-term total occlusion conditions.
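The abstract distinguishes three operating regimes: full marker visibility (optical 6-DOF pose, IMU bias re-estimation), partial occlusion (optical 3-DOF position combined with IMU orientation), and total occlusion (bias-corrected inertial tracking only). A minimal Python sketch of that mode-switching logic follows; the function name and string labels are illustrative assumptions, not identifiers from the paper:

```python
def select_mode(n_visible_markers, n_total_markers):
    """Pick the pose-estimation regime based on marker visibility."""
    # Full visibility: the optical tracker supplies the complete 6-DOF
    # pose, and the IMU biases can be re-estimated against it.
    if n_visible_markers == n_total_markers:
        return "full_optical"
    # Partial occlusion: at least one marker still gives a 3-DOF
    # position, which is fused with the IMU-estimated orientation.
    if n_visible_markers >= 1:
        return "hybrid"
    # Total occlusion: fall back to bias-corrected inertial tracking.
    return "inertial_only"
```

In the paper's experiments the "hybrid" regime can run long-term, while "inertial_only" is reliable only for short intervals (about 1 s) before drift accumulates.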


Bibliographic Details
Main Authors: He, Changyu; Kazanzides, Peter; Sen, Hasan Tutkun; Kim, Sungmin; Liu, Yue
Format: Online Article Text
Language: English
Published: MDPI, 2015
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4541887/
https://www.ncbi.nlm.nih.gov/pubmed/26184191
http://dx.doi.org/10.3390/s150716448
Published in: Sensors (Basel). MDPI, 8 July 2015.
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).