
Robust detection and tracking of annotations for outdoor augmented reality browsing


Bibliographic Details
Main Authors: Langlotz, Tobias, Degendorfer, Claus, Mulloni, Alessandro, Schall, Gerhard, Reitmayr, Gerhard, Schmalstieg, Dieter
Format: Online Article Text
Language: English
Published: Pergamon Press 2011
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3149669/
https://www.ncbi.nlm.nih.gov/pubmed/21976781
http://dx.doi.org/10.1016/j.cag.2011.04.004
_version_ 1782209486116618240
author Langlotz, Tobias
Degendorfer, Claus
Mulloni, Alessandro
Schall, Gerhard
Reitmayr, Gerhard
Schmalstieg, Dieter
author_facet Langlotz, Tobias
Degendorfer, Claus
Mulloni, Alessandro
Schall, Gerhard
Reitmayr, Gerhard
Schmalstieg, Dieter
author_sort Langlotz, Tobias
collection PubMed
description A common goal of outdoor augmented reality (AR) is the presentation of annotations that are registered to anchor points in the real world. We present an enhanced approach for registering and tracking such anchor points, which is suitable for current-generation mobile phones and can also successfully deal with the wide variety of viewing conditions encountered in real-life outdoor use. The approach is based on on-the-fly generation of panoramic images by sweeping the camera over the scene. The panoramas are then used for stable orientation tracking, while the user is performing only rotational movements. This basic approach is improved by several new techniques for the re-detection and tracking of anchor points. For re-detection, specifically after temporal variations, we first compute a panoramic image with extended dynamic range, which can better represent varying illumination conditions. The panorama is then searched for known anchor points, while orientation tracking continues uninterrupted. We then use information from an internal orientation sensor to prime an active search scheme for the anchor points, which improves matching results. Finally, global consistency is enhanced by statistical estimation of a global rotation that minimizes the overall position error of anchor points when transforming them from the source panorama in which they were created to the current view represented by a new panorama. Once the anchor points are re-detected, we track the user's movement using a novel 3-degree-of-freedom orientation tracking approach that combines vision tracking with the absolute orientation from inertial and magnetic sensors. We tested our system using an AR campus guide as an example application and provide detailed results for our approach using an off-the-shelf smartphone. Results show that the re-detection rate is improved by a factor of 2 compared to previous work and reaches almost 90% for a wide variety of test cases while still running at interactive frame rates. (An illustrative sketch of the global-rotation estimation step follows the record fields below.)
format Online
Article
Text
id pubmed-3149669
institution National Center for Biotechnology Information
language English
publishDate 2011
publisher Pergamon Press
record_format MEDLINE/PubMed
spelling pubmed-31496692011-10-03 Robust detection and tracking of annotations for outdoor augmented reality browsing Langlotz, Tobias Degendorfer, Claus Mulloni, Alessandro Schall, Gerhard Reitmayr, Gerhard Schmalstieg, Dieter Comput Graph Mobile Augmented Reality A common goal of outdoor augmented reality (AR) is the presentation of annotations that are registered to anchor points in the real world. We present an enhanced approach for registering and tracking such anchor points, which is suitable for current generation mobile phones and can also successfully deal with the wide variety of viewing conditions encountered in real life outdoor use. The approach is based on on-the-fly generation of panoramic images by sweeping the camera over the scene. The panoramas are then used for stable orientation tracking, while the user is performing only rotational movements. This basic approach is improved by several new techniques for the re-detection and tracking of anchor points. For the re-detection, specifically after temporal variations, we first compute a panoramic image with extended dynamic range, which can better represent varying illumination conditions. The panorama is then searched for known anchor points, while orientation tracking continues uninterrupted. We then use information from an internal orientation sensor to prime an active search scheme for the anchor points, which improves matching results. Finally, global consistency is enhanced by statistical estimation of a global rotation that minimizes the overall position error of anchor points when transforming them from the source panorama in which they were created, to the current view represented by a new panorama. Once the anchor points are redetected, we track the user's movement using a novel 3-degree-of-freedom orientation tracking approach that combines vision tracking with the absolute orientation from inertial and magnetic sensors. We tested our system using an AR campus guide as an example application and provide detailed results for our approach using an off-the-shelf smartphone. Results show that the re-detection rate is improved by a factor of 2 compared to previous work and reaches almost 90% for a wide variety of test cases while still keeping the ability to run at interactive frame rates. Pergamon Press 2011-08 /pmc/articles/PMC3149669/ /pubmed/21976781 http://dx.doi.org/10.1016/j.cag.2011.04.004 Text en © 2011 Elsevier Ltd. https://creativecommons.org/licenses/by-nc-nd/3.0/ Open Access under CC BY-NC-ND 3.0 (https://creativecommons.org/licenses/by-nc-nd/3.0/) license
spellingShingle Mobile Augmented Reality
Langlotz, Tobias
Degendorfer, Claus
Mulloni, Alessandro
Schall, Gerhard
Reitmayr, Gerhard
Schmalstieg, Dieter
Robust detection and tracking of annotations for outdoor augmented reality browsing
title Robust detection and tracking of annotations for outdoor augmented reality browsing
title_full Robust detection and tracking of annotations for outdoor augmented reality browsing
title_fullStr Robust detection and tracking of annotations for outdoor augmented reality browsing
title_full_unstemmed Robust detection and tracking of annotations for outdoor augmented reality browsing
title_short Robust detection and tracking of annotations for outdoor augmented reality browsing
title_sort robust detection and tracking of annotations for outdoor augmented reality browsing
topic Mobile Augmented Reality
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3149669/
https://www.ncbi.nlm.nih.gov/pubmed/21976781
http://dx.doi.org/10.1016/j.cag.2011.04.004
work_keys_str_mv AT langlotztobias robustdetectionandtrackingofannotationsforoutdooraugmentedrealitybrowsing
AT degendorferclaus robustdetectionandtrackingofannotationsforoutdooraugmentedrealitybrowsing
AT mullonialessandro robustdetectionandtrackingofannotationsforoutdooraugmentedrealitybrowsing
AT schallgerhard robustdetectionandtrackingofannotationsforoutdooraugmentedrealitybrowsing
AT reitmayrgerhard robustdetectionandtrackingofannotationsforoutdooraugmentedrealitybrowsing
AT schmalstiegdieter robustdetectionandtrackingofannotationsforoutdooraugmentedrealitybrowsing
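
Illustrative note on the global-rotation estimation mentioned in the description field above: the Python sketch below shows one conventional way such a rotation could be estimated from matched anchor-point directions, as a least-squares (Kabsch/Wahba-style) fit solved by SVD. The bearing parameterization, the function names, and the toy self-test are assumptions made for illustration only; the paper's actual statistical estimator and panorama mapping may differ.

import numpy as np

def bearing_from_panorama(azimuth, elevation):
    # Unit viewing direction for an anchor point at the given panorama
    # angles (radians). The exact cylindrical mapping used by the paper
    # may differ; this spherical parameterization is an assumption.
    return np.array([
        np.cos(elevation) * np.sin(azimuth),
        np.sin(elevation),
        np.cos(elevation) * np.cos(azimuth),
    ])

def estimate_global_rotation(src_dirs, dst_dirs):
    # Least-squares rotation R minimizing sum_i ||dst_i - R @ src_i||^2
    # over matched unit directions (Kabsch/Wahba solution via SVD).
    # src_dirs, dst_dirs: (N, 3) arrays of matched anchor-point bearings
    # in the source panorama and the newly built panorama.
    H = dst_dirs.T @ src_dirs                  # 3x3 correlation of matches
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(U @ Vt))         # guard against reflections
    return U @ np.diag([1.0, 1.0, d]) @ Vt

if __name__ == "__main__":
    # Toy check: recover a known 10-degree yaw between two panoramas.
    rng = np.random.default_rng(0)
    angles = rng.uniform([-np.pi, -0.3], [np.pi, 0.3], size=(20, 2))
    src = np.array([bearing_from_panorama(a, e) for a, e in angles])
    yaw = np.deg2rad(10.0)
    R_true = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                       [0.0, 1.0, 0.0],
                       [-np.sin(yaw), 0.0, np.cos(yaw)]])
    dst = src @ R_true.T
    R_est = estimate_global_rotation(src, dst)
    print(np.allclose(R_est, R_true, atol=1e-6))  # expect True

Given an estimated R, anchor points stored with the source panorama could be mapped into the newly built panorama and compared against the re-detected positions, offering one way to reject mismatched anchor points before annotations are re-attached.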