Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling
RGB-D sensors (sensors with an RGB camera and a depth camera) are novel sensing systems that capture RGB images along with pixel-wise depth information. Although they are widely used in various applications, RGB-D sensors have significant drawbacks including limited measurement ranges (e.g., within 3 m)...
Main Authors: | Tang, Shengjun, Zhu, Qing, Chen, Wu, Darwish, Walid, Wu, Bo, Hu, Han, Chen, Min |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2016 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5087378/ https://www.ncbi.nlm.nih.gov/pubmed/27690028 http://dx.doi.org/10.3390/s16101589 |
_version_ | 1782463894090940416 |
---|---|
author | Tang, Shengjun Zhu, Qing Chen, Wu Darwish, Walid Wu, Bo Hu, Han Chen, Min |
author_facet | Tang, Shengjun Zhu, Qing Chen, Wu Darwish, Walid Wu, Bo Hu, Han Chen, Min |
author_sort | Tang, Shengjun |
collection | PubMed |
description | RGB-D sensors (sensors with an RGB camera and a depth camera) are novel sensing systems that capture RGB images along with pixel-wise depth information. Although they are widely used in various applications, RGB-D sensors have significant drawbacks for 3D dense mapping, including limited measurement ranges (e.g., within 3 m) and depth measurement errors that increase with distance from the sensor. In this paper, we present a novel approach that geometrically integrates the depth scene and the RGB scene to enlarge the measurement distance of RGB-D sensors and enrich the details of the model generated from the depth images. First, a precise calibration procedure for RGB-D sensors is introduced. In addition to the internal and external parameters of both the IR camera and the RGB camera, the relative pose between the two cameras is also calibrated. Second, to ensure the pose accuracy of the RGB images, a refined false feature match rejection method is introduced that combines the depth information with the initial camera poses between frames of the RGB-D sensor. Then, a global optimization model is used to improve the accuracy of the camera poses, decreasing the inconsistencies between the depth frames in advance. To eliminate the geometric inconsistencies between the RGB scene and the depth scene, the scale ambiguity encountered during pose estimation with RGB image sequences is resolved by integrating the depth and visual information, and a robust rigid-transformation recovery method is developed to register the RGB scene to the depth scene (a brief illustrative sketch of this registration step follows the record below). The benefit of the proposed joint optimization method is first evaluated with publicly available benchmark datasets collected with a Kinect. The proposed method is then examined with two additional datasets collected in indoor and outdoor environments. The experimental results demonstrate the feasibility and robustness of the proposed method. |
format | Online Article Text |
id | pubmed-5087378 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2016 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-50873782016-11-07 Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling Tang, Shengjun Zhu, Qing Chen, Wu Darwish, Walid Wu, Bo Hu, Han Chen, Min Sensors (Basel) Article RGB-D sensors (sensors with an RGB camera and a depth camera) are novel sensing systems that capture RGB images along with pixel-wise depth information. Although they are widely used in various applications, RGB-D sensors have significant drawbacks for 3D dense mapping, including limited measurement ranges (e.g., within 3 m) and depth measurement errors that increase with distance from the sensor. In this paper, we present a novel approach that geometrically integrates the depth scene and the RGB scene to enlarge the measurement distance of RGB-D sensors and enrich the details of the model generated from the depth images. First, a precise calibration procedure for RGB-D sensors is introduced. In addition to the internal and external parameters of both the IR camera and the RGB camera, the relative pose between the two cameras is also calibrated. Second, to ensure the pose accuracy of the RGB images, a refined false feature match rejection method is introduced that combines the depth information with the initial camera poses between frames of the RGB-D sensor. Then, a global optimization model is used to improve the accuracy of the camera poses, decreasing the inconsistencies between the depth frames in advance. To eliminate the geometric inconsistencies between the RGB scene and the depth scene, the scale ambiguity encountered during pose estimation with RGB image sequences is resolved by integrating the depth and visual information, and a robust rigid-transformation recovery method is developed to register the RGB scene to the depth scene. The benefit of the proposed joint optimization method is first evaluated with publicly available benchmark datasets collected with a Kinect. The proposed method is then examined with two additional datasets collected in indoor and outdoor environments. The experimental results demonstrate the feasibility and robustness of the proposed method. MDPI 2016-09-27 /pmc/articles/PMC5087378/ /pubmed/27690028 http://dx.doi.org/10.3390/s16101589 Text en © 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Tang, Shengjun Zhu, Qing Chen, Wu Darwish, Walid Wu, Bo Hu, Han Chen, Min Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling |
title | Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling |
title_full | Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling |
title_fullStr | Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling |
title_full_unstemmed | Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling |
title_short | Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling |
title_sort | enhanced rgb-d mapping method for detailed 3d indoor and outdoor modeling |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5087378/ https://www.ncbi.nlm.nih.gov/pubmed/27690028 http://dx.doi.org/10.3390/s16101589 |
work_keys_str_mv | AT tangshengjun enhancedrgbdmappingmethodfordetailed3dindoorandoutdoormodeling AT zhuqing enhancedrgbdmappingmethodfordetailed3dindoorandoutdoormodeling AT chenwu enhancedrgbdmappingmethodfordetailed3dindoorandoutdoormodeling AT darwishwalid enhancedrgbdmappingmethodfordetailed3dindoorandoutdoormodeling AT wubo enhancedrgbdmappingmethodfordetailed3dindoorandoutdoormodeling AT huhan enhancedrgbdmappingmethodfordetailed3dindoorandoutdoormodeling AT chenmin enhancedrgbdmappingmethodfordetailed3dindoorandoutdoormodeling |
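The abstract above describes recovering a scale and a rigid transformation to register the up-to-scale RGB (image-based) reconstruction to the metric depth reconstruction. The record gives no implementation details, so the following is only a minimal illustrative sketch, not the authors' method: it assumes corresponding 3D points from the two reconstructions are already available and uses the standard closed-form least-squares similarity-transform solution (Umeyama's method). The function name `similarity_transform` and the NumPy-based setup are assumptions for illustration.

```python
# Illustrative sketch (not the authors' code): estimate a similarity transform
# (scale s, rotation R, translation t) mapping an up-to-scale "RGB scene" onto
# a metric "depth scene", given corresponding 3D points from both.
import numpy as np

def similarity_transform(src, dst):
    """Estimate s, R, t such that dst ~= s * R @ src + t (Umeyama, 1991).

    src, dst: (N, 3) arrays of corresponding 3D points (N >= 3, non-degenerate).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst

    # Cross-covariance between the centered point sets.
    cov = dst_c.T @ src_c / src.shape[0]
    U, D, Vt = np.linalg.svd(cov)

    # Keep R a proper rotation (det = +1) even if the SVD yields a reflection.
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0

    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / src.shape[0]
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_dst - s * R @ mu_src
    return s, R, t

# Toy usage: recover a known similarity transform from exact correspondences.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(-1, 1, size=(100, 3))          # "RGB scene" points (arbitrary scale)
    angle = np.deg2rad(30)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0, 0.0, 1.0]])
    s_true, t_true = 2.5, np.array([0.3, -0.1, 1.0])
    depth_pts = s_true * pts @ R_true.T + t_true      # "depth scene" points (metric)
    s, R, t = similarity_transform(pts, depth_pts)
    print(s, np.allclose(R, R_true, atol=1e-6), t)
```

In practice such a closed-form estimate would typically be wrapped in a robust loop (e.g., RANSAC over the correspondences) to tolerate residual false matches, which is in the spirit of the robust rigid-transformation recovery the abstract mentions.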