GEUINF: Real-Time Visualization of Indoor Facilities Using Mixed Reality
Mixed reality (MR) enables a novel way to visualize virtual objects in real scenes while respecting physical constraints. This technology arises alongside other significant advances in sensor fusion for human-centric 3D capturing. Recent advances in scanning the user environment, real-time v...
Main Authors: | Jurado, David; Jurado, Juan M.; Ortega, Lidia; Feito, Francisco R. |
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2021 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7915880/ https://www.ncbi.nlm.nih.gov/pubmed/33562761 http://dx.doi.org/10.3390/s21041123 |
_version_ | 1783657350135021568 |
author | Jurado, David; Jurado, Juan M.; Ortega, Lidia; Feito, Francisco R. |
author_facet | Jurado, David; Jurado, Juan M.; Ortega, Lidia; Feito, Francisco R. |
author_sort | Jurado, David |
collection | PubMed |
description | Mixed reality (MR) enables a novel way to visualize virtual objects in real scenes while respecting physical constraints. This technology arises alongside other significant advances in sensor fusion for human-centric 3D capturing. Recent advances in scanning the user environment, real-time visualization and 3D vision on ubiquitous systems such as smartphones allow us to capture 3D data from the real world. In this paper, a disruptive application for assessing the status of indoor infrastructure is proposed. The installation and maintenance of hidden facilities such as water pipes, electrical lines and air conditioning ducts, which are usually occluded behind walls, are tedious and inefficient tasks. Most of these infrastructures are digitized, but they cannot be visualized on site. In this research, we focus on the development of a new application (GEUINF) that runs on smartphones capable of capturing 3D data of the real world by depth sensing. This information is used to determine the user's position and orientation. Although previous approaches used fixed markers for this purpose, our application estimates both parameters with centimeter accuracy without them. This is possible because our method matches reconstructed walls of the real world against the 3D planes of the replicated world in a virtual environment. Our markerless approach scans planar surfaces of the user environment and geometrically aligns them with their corresponding virtual 3D entities. In a preprocessing phase, the 2D CAD geometry available from an architectural project is used to generate 3D models of the indoor building structure. At run time, these virtual elements are tracked against the real ones reconstructed using the ARCore library. Once the alignment between the virtual and real worlds is done, the application enables visualization, navigation and interaction with the virtual facility networks in real time. Thus, our method may be used by private companies and public institutions responsible for indoor facilities management, and it may also be integrated with other applications focused on indoor navigation. (A minimal geometric sketch of the wall-matching alignment step appears below, after the record fields.) |
format | Online Article Text |
id | pubmed-7915880 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-7915880 2021-03-01 GEUINF: Real-Time Visualization of Indoor Facilities Using Mixed Reality Jurado, David; Jurado, Juan M.; Ortega, Lidia; Feito, Francisco R. Sensors (Basel) Article MDPI 2021-02-05 /pmc/articles/PMC7915880/ /pubmed/33562761 http://dx.doi.org/10.3390/s21041123 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article; Jurado, David; Jurado, Juan M.; Ortega, Lidia; Feito, Francisco R.; GEUINF: Real-Time Visualization of Indoor Facilities Using Mixed Reality |
title | GEUINF: Real-Time Visualization of Indoor Facilities Using Mixed Reality |
title_full | GEUINF: Real-Time Visualization of Indoor Facilities Using Mixed Reality |
title_fullStr | GEUINF: Real-Time Visualization of Indoor Facilities Using Mixed Reality |
title_full_unstemmed | GEUINF: Real-Time Visualization of Indoor Facilities Using Mixed Reality |
title_short | GEUINF: Real-Time Visualization of Indoor Facilities Using Mixed Reality |
title_sort | geuinf: real-time visualization of indoor facilities using mixed reality |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7915880/ https://www.ncbi.nlm.nih.gov/pubmed/33562761 http://dx.doi.org/10.3390/s21041123 |
work_keys_str_mv | AT juradodavid geuinfrealtimevisualizationofindoorfacilitiesusingmixedreality AT juradojuanm geuinfrealtimevisualizationofindoorfacilitiesusingmixedreality AT ortegalidia geuinfrealtimevisualizationofindoorfacilitiesusingmixedreality AT feitofranciscor geuinfrealtimevisualizationofindoorfacilitiesusingmixedreality |
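The markerless alignment described in the abstract reduces to a simple geometric problem: each detected vertical wall (a point and a normal in the AR session frame) is matched to its counterpart plane in the CAD-derived model, and the matches fix a yaw rotation and a horizontal translation between the two frames. The snippet below is a minimal sketch of that idea, not the authors' implementation: the function names, the y-up axis convention, and the use of exactly two matched non-parallel walls (a corner) are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def yaw_rotation(theta):
    """Rotation about the vertical (y) axis that adds `theta` to the
    horizontal angle atan2(z, x) of a vector."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, -s],
                     [0.0, 1.0, 0.0],
                     [s, 0.0, c]])

def align_model_to_scan(virtual_walls, real_walls):
    """Estimate (R, t) such that x_real ≈ R @ x_virtual + t from matched walls.

    Each wall is a (point_on_plane, unit_normal) pair of 3-vectors, y up.
    `virtual_walls` come from the CAD-derived model, `real_walls` from the
    scanned planes; both lists hold two matched, non-parallel vertical walls.
    """
    # 1) Yaw: per-wall angle between the virtual and real normals in the
    #    horizontal (x, z) plane, combined with a circular mean.
    angles = []
    for (_, nv), (_, nr) in zip(virtual_walls, real_walls):
        angles.append(np.arctan2(nr[2], nr[0]) - np.arctan2(nv[2], nv[0]))
    theta = np.arctan2(np.mean(np.sin(angles)), np.mean(np.cos(angles)))
    R = yaw_rotation(theta)

    # 2) Translation: a transformed virtual wall point must lie on the real
    #    wall plane, so each wall gives one constraint
    #    n_r . t = n_r . (p_r - R p_v). Two non-parallel vertical walls fix
    #    (t_x, t_z); t_y is assumed zero (shared floor level).
    A, b = [], []
    for (pv, _), (pr, nr) in zip(virtual_walls, real_walls):
        A.append([nr[0], nr[2]])
        b.append(float(nr @ (pr - R @ pv)))
    tx, tz = np.linalg.solve(np.array(A), np.array(b))
    return R, np.array([tx, 0.0, tz])


# Hypothetical example: a corner whose virtual walls face +x and +z is found
# rotated by 30 degrees and shifted in the scanned scene.
if __name__ == "__main__":
    vwalls = [(np.array([2.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])),
              (np.array([0.0, 0.0, 3.0]), np.array([0.0, 0.0, 1.0]))]
    R_true = yaw_rotation(np.deg2rad(30.0))
    t_true = np.array([0.4, 0.0, -1.2])
    rwalls = [(R_true @ p + t_true, R_true @ n) for p, n in vwalls]
    R, t = align_model_to_scan(vwalls, rwalls)
    print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

A single wall only constrains the translation perpendicular to it, which is why a corner (two non-parallel walls) or additional matched planes are needed for a full horizontal fix; in practice the vertical planes detected by ARCore supply these candidate walls.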