
UltrARsound: in situ visualization of live ultrasound images using HoloLens 2


Bibliographic Details
Main Authors: von Haxthausen, Felix, Moreta-Martinez, Rafael, Pose Díez de la Lastra, Alicia, Pascau, Javier, Ernst, Floris
Format: Online Article Text
Language: English
Published: Springer International Publishing 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9515035/
https://www.ncbi.nlm.nih.gov/pubmed/35776399
http://dx.doi.org/10.1007/s11548-022-02695-z
collection PubMed
description PURPOSE: Augmented Reality (AR) has the potential to simplify ultrasound (US) examinations, which usually require a skilled and experienced sonographer to mentally align narrow 2D cross-sectional US images with the 3D anatomy of the patient. This work describes and evaluates a novel approach to tracking retroreflective spheres attached to the US probe using an inside-out technique with the AR glasses HoloLens 2. Finally, live US images are displayed in situ on the imaged anatomy. METHODS: The Unity application UltrARsound performs spatial tracking of the US probe and attached retroreflective markers using the depth camera integrated into the AR glasses, thus eliminating the need for an external tracking system. Additionally, a Kalman filter is implemented to improve the noisy measurements of the camera. US images are streamed wirelessly to HoloLens 2 via the PLUS toolkit. The technical evaluation comprises static and dynamic tracking accuracy as well as the frequency and latency of displayed images. RESULTS: With the Kalman filter, tracking achieves a median accuracy of 1.98 mm / 1.81° in the static setting. In the dynamic scenario, the median error was 2.81 mm / 1.70°. The tracking frequency is currently limited to 20 Hz. 83% of the displayed US images had a latency lower than 16 ms. CONCLUSIONS: In this work, we showed that spatial tracking of retroreflective spheres with the depth camera of HoloLens 2 is feasible, achieving a promising accuracy for in situ visualization of live US images. Tracking requires no additional hardware and no modifications to HoloLens 2, making it a cheap and easy-to-use approach. Moreover, the minimal latency of displayed images enables real-time perception for the sonographer.
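The abstract describes a Kalman filter used to smooth the depth camera's noisy pose measurements. The sketch below is a minimal, hypothetical 1D constant-position Kalman filter illustrating the general smoothing idea only; it is not the authors' implementation, and the noise parameters `q` and `r` as well as the example data are assumed for demonstration.

```python
# Minimal 1D Kalman filter sketch for smoothing noisy tracking measurements.
# Constant-position model: the state is a single coordinate assumed static
# between updates; q and r are assumed process/measurement noise variances.

def kalman_smooth(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Return Kalman-filtered estimates for a sequence of scalar measurements."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: constant-position model, so only uncertainty grows
        p = p + q
        # Update: blend prediction and measurement via the Kalman gain
        k = p / (p + r)          # Kalman gain in [0, 1]
        x = x + k * (z - x)      # corrected state estimate
        p = (1.0 - k) * p        # corrected estimate covariance
        estimates.append(x)
    return estimates

# Example: a static marker at position 5.0 mm observed with noise
noisy = [5.3, 4.6, 5.2, 4.9, 5.4, 4.7, 5.1, 4.8]
filtered = kalman_smooth(noisy, x0=noisy[0])
```

In a 3D tracking setting the same predict/update cycle would run per pose component (or on the full state vector with covariance matrices); a constant-velocity model is a common extension for a moving probe.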
id pubmed-9515035
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-9515035 2022-09-29
journal Int J Comput Assist Radiol Surg (Original Article)
published online 2022-07-01, Springer International Publishing
license © The Author(s) 2022. Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.
topic Original Article