A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks †
From object tracking to 3D reconstruction, RGB-Depth (RGB-D) camera networks play an increasingly important role in many vision and graphics applications. Practical applications often use sparsely-placed cameras to maximize visibility, while using as few cameras as possible to minimize cost. In general, it is challenging to calibrate sparse camera networks due to the lack of shared scene features across different camera views. In this paper, we propose a novel algorithm that can accurately and rapidly calibrate the geometric relationships across an arbitrary number of RGB-D cameras on a network. Our work has a number of novel features. First, to cope with the wide separation between different cameras, we establish view correspondences by using a spherical calibration object. We show that this approach outperforms other techniques based on planar calibration objects. Second, instead of modeling camera extrinsic calibration using rigid transformation, which is optimal only for pinhole cameras, we systematically test different view transformation functions including rigid transformation, polynomial transformation and manifold regression to determine the most robust mapping that generalizes well to unseen data. Third, we reformulate the celebrated bundle adjustment procedure to minimize the global 3D reprojection error so as to fine-tune the initial estimates. Finally, our scalable client-server architecture is computationally efficient: the calibration of a five-camera system, including data capture, can be done in minutes using only commodity PCs. Our proposed framework is compared with other state-of-the-art systems using both quantitative measurements and visual alignment results of the merged point clouds.
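The pipeline summarized in the abstract first establishes correspondences by tracking a spherical calibration object in each RGB-D view, then fits a view transformation between cameras. As a minimal, illustrative sketch (not the authors' implementation), the Python snippet below estimates a rigid transform between two cameras from matched 3D sphere-center points using the Kabsch/orthogonal Procrustes method; the function name, variable names, and synthetic data are assumptions for illustration.

```python
# Minimal sketch: closed-form rigid alignment of matched 3D points (Kabsch method).
# This is a generic baseline for the rigid-transformation case discussed in the
# abstract, not the paper's code.
import numpy as np


def rigid_transform(src, dst):
    """Return R (3x3) and t (3,) such that dst ≈ R @ src_i + t for each point.

    src, dst: (N, 3) arrays of corresponding 3D points, e.g., sphere centers
    observed by camera A and camera B at the same time instants.
    """
    src_c = src.mean(axis=0)                 # centroids
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t


# Usage with synthetic data: recover a known rotation and translation exactly.
rng = np.random.default_rng(0)
pts_a = rng.uniform(-1.0, 1.0, size=(30, 3))          # centers in camera A's frame
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([0.5, -0.2, 2.0])
pts_b = pts_a @ R_true.T + t_true                      # same centers in camera B's frame
R_est, t_est = rigid_transform(pts_a, pts_b)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

A closed-form fit like this is one instance of the rigid-transformation mapping that the paper compares against polynomial transformation and manifold regression.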
Main Authors: | Su, Po-Chang; Shen, Ju; Xu, Wanxin; Cheung, Sen-Ching S.; Luo, Ying |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2018 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5795566/ https://www.ncbi.nlm.nih.gov/pubmed/29342968 http://dx.doi.org/10.3390/s18010235 |
_version_ | 1783297322639163392 |
---|---|
author | Su, Po-Chang Shen, Ju Xu, Wanxin Cheung, Sen-Ching S. Luo, Ying |
author_facet | Su, Po-Chang Shen, Ju Xu, Wanxin Cheung, Sen-Ching S. Luo, Ying |
author_sort | Su, Po-Chang |
collection | PubMed |
description | From object tracking to 3D reconstruction, RGB-Depth (RGB-D) camera networks play an increasingly important role in many vision and graphics applications. Practical applications often use sparsely-placed cameras to maximize visibility, while using as few cameras as possible to minimize cost. In general, it is challenging to calibrate sparse camera networks due to the lack of shared scene features across different camera views. In this paper, we propose a novel algorithm that can accurately and rapidly calibrate the geometric relationships across an arbitrary number of RGB-D cameras on a network. Our work has a number of novel features. First, to cope with the wide separation between different cameras, we establish view correspondences by using a spherical calibration object. We show that this approach outperforms other techniques based on planar calibration objects. Second, instead of modeling camera extrinsic calibration using rigid transformation, which is optimal only for pinhole cameras, we systematically test different view transformation functions including rigid transformation, polynomial transformation and manifold regression to determine the most robust mapping that generalizes well to unseen data. Third, we reformulate the celebrated bundle adjustment procedure to minimize the global 3D reprojection error so as to fine-tune the initial estimates. Finally, our scalable client-server architecture is computationally efficient: the calibration of a five-camera system, including data capture, can be done in minutes using only commodity PCs. Our proposed framework is compared with other state-of-the-art systems using both quantitative measurements and visual alignment results of the merged point clouds. |
format | Online Article Text |
id | pubmed-5795566 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2018 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-5795566 2018-02-13 A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks † Su, Po-Chang Shen, Ju Xu, Wanxin Cheung, Sen-Ching S. Luo, Ying Sensors (Basel) Article From object tracking to 3D reconstruction, RGB-Depth (RGB-D) camera networks play an increasingly important role in many vision and graphics applications. Practical applications often use sparsely-placed cameras to maximize visibility, while using as few cameras as possible to minimize cost. In general, it is challenging to calibrate sparse camera networks due to the lack of shared scene features across different camera views. In this paper, we propose a novel algorithm that can accurately and rapidly calibrate the geometric relationships across an arbitrary number of RGB-D cameras on a network. Our work has a number of novel features. First, to cope with the wide separation between different cameras, we establish view correspondences by using a spherical calibration object. We show that this approach outperforms other techniques based on planar calibration objects. Second, instead of modeling camera extrinsic calibration using rigid transformation, which is optimal only for pinhole cameras, we systematically test different view transformation functions including rigid transformation, polynomial transformation and manifold regression to determine the most robust mapping that generalizes well to unseen data. Third, we reformulate the celebrated bundle adjustment procedure to minimize the global 3D reprojection error so as to fine-tune the initial estimates. Finally, our scalable client-server architecture is computationally efficient: the calibration of a five-camera system, including data capture, can be done in minutes using only commodity PCs. Our proposed framework is compared with other state-of-the-art systems using both quantitative measurements and visual alignment results of the merged point clouds. MDPI 2018-01-15 /pmc/articles/PMC5795566/ /pubmed/29342968 http://dx.doi.org/10.3390/s18010235 Text en © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Su, Po-Chang Shen, Ju Xu, Wanxin Cheung, Sen-Ching S. Luo, Ying A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks † |
title | A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks † |
title_full | A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks † |
title_fullStr | A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks † |
title_full_unstemmed | A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks † |
title_short | A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks † |
title_sort | fast and robust extrinsic calibration for rgb-d camera networks † |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5795566/ https://www.ncbi.nlm.nih.gov/pubmed/29342968 http://dx.doi.org/10.3390/s18010235 |
work_keys_str_mv | AT supochang afastandrobustextrinsiccalibrationforrgbdcameranetworks AT shenju afastandrobustextrinsiccalibrationforrgbdcameranetworks AT xuwanxin afastandrobustextrinsiccalibrationforrgbdcameranetworks AT cheungsenchings afastandrobustextrinsiccalibrationforrgbdcameranetworks AT luoying afastandrobustextrinsiccalibrationforrgbdcameranetworks AT supochang fastandrobustextrinsiccalibrationforrgbdcameranetworks AT shenju fastandrobustextrinsiccalibrationforrgbdcameranetworks AT xuwanxin fastandrobustextrinsiccalibrationforrgbdcameranetworks AT cheungsenchings fastandrobustextrinsiccalibrationforrgbdcameranetworks AT luoying fastandrobustextrinsiccalibrationforrgbdcameranetworks |
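The abstract also notes that the initial pairwise estimates are fine-tuned by a reformulated bundle adjustment that minimizes a global 3D reprojection error. As a hedged, illustrative analogue (not the paper's formulation), the sketch below jointly refines per-camera rigid poses so that each camera's observed sphere centers, mapped into a reference frame, agree with a shared set of reference points; the function name, residual definition, and parameterization are assumptions.

```python
# Illustrative joint-refinement sketch, not the paper's bundle adjustment: refine
# 6 pose parameters per camera (rotation vector + translation) by nonlinear least
# squares over 3D alignment residuals.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def refine_poses(obs, ref, x0):
    """Jointly refine camera poses, 6 parameters per camera.

    obs: list of (N, 3) arrays of sphere centers seen by each non-reference camera.
    ref: (N, 3) array of the same centers expressed in the reference camera's frame.
    x0:  flat array of initial parameters, e.g., from a pairwise Kabsch fit.
    """
    def residuals(x):
        res = []
        for k, pts in enumerate(obs):
            rvec, t = x[6 * k:6 * k + 3], x[6 * k + 3:6 * k + 6]
            mapped = Rotation.from_rotvec(rvec).apply(pts) + t
            res.append((mapped - ref).ravel())  # 3D alignment error for every point
        return np.concatenate(res)

    return least_squares(residuals, x0).x
```

The residual here is a plain 3D alignment error; it only illustrates the structure of a joint refinement over all cameras, whereas the paper minimizes its own global 3D reprojection error.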