
Video-based augmented reality combining CT-scan and instrument position data to microscope view in middle ear surgery

The aim of the study was to develop and assess the performance of a video-based augmented reality system, combining preoperative computed tomography (CT) and real-time microscopic video, as the first crucial step to keyhole middle ear procedures through a tympanic membrane puncture. Six different artificial human temporal bones were included in this prospective study. Six stainless steel fiducial markers were glued on the periphery of the eardrum, and a high-resolution CT-scan of the temporal bone was obtained. Virtual endoscopy of the middle ear based on this CT-scan was conducted on Osirix software. The virtual endoscopy image was registered to the microscope-based video of the intact tympanic membrane based on fiducial markers, and a homography transformation was applied during microscope movements. These movements were tracked using the Speeded-Up Robust Features (SURF) method. Simultaneously, a micro-surgical instrument was identified and tracked using a Kalman filter. The 3D position of the instrument was extracted by solving a three-point perspective framework. For evaluation, the instrument was introduced through the tympanic membrane and ink droplets were injected on three middle ear structures. An average initial registration accuracy of 0.21 ± 0.10 mm (n = 3) was achieved with a slow propagation error during tracking (0.04 ± 0.07 mm). The estimated surgical instrument tip position error was 0.33 ± 0.22 mm. The target structures’ localization accuracy was 0.52 ± 0.15 mm. The submillimetric accuracy of our system without a tracker is compatible with ear surgery.
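
As an illustration of the registration step described in the abstract (a fiducial-based homography between the virtual endoscopy image and the microscope video, then propagation of that registration with SURF feature matching during microscope movements), here is a minimal sketch assuming Python with OpenCV (SURF needs the opencv-contrib build). The helper names register_with_fiducials and propagate_homography, the Hessian threshold, and the ratio-test value are illustrative assumptions, not the authors' published implementation.

import cv2
import numpy as np

def register_with_fiducials(virtual_pts, video_pts):
    # Homography mapping virtual-endoscopy pixels to microscope-video pixels,
    # estimated from matched fiducial-marker centroids (at least 4 point pairs;
    # the study used six markers on the eardrum periphery).
    H, _ = cv2.findHomography(np.float32(virtual_pts), np.float32(video_pts), cv2.RANSAC)
    return H

def propagate_homography(prev_gray, curr_gray, H_prev):
    # Estimate the inter-frame homography from SURF matches between consecutive
    # microscope frames and chain it onto the existing registration.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # assumed threshold
    kp1, des1 = surf.detectAndCompute(prev_gray, None)
    kp2, des2 = surf.detectAndCompute(curr_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe ratio test
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H_frame, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H_frame @ H_prev  # virtual image -> current microscope frame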


Bibliographic Details
Main Authors: Hussain, Raabid, Lalande, Alain, Marroquin, Roberto, Guigou, Caroline, Bozorg Grayeli, Alexis
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7174368/
https://www.ncbi.nlm.nih.gov/pubmed/32317726
http://dx.doi.org/10.1038/s41598-020-63839-2
_version_ 1783524623343681536
author Hussain, Raabid
Lalande, Alain
Marroquin, Roberto
Guigou, Caroline
Bozorg Grayeli, Alexis
author_sort Hussain, Raabid
collection PubMed
description The aim of the study was to develop and assess the performance of a video-based augmented reality system, combining preoperative computed tomography (CT) and real-time microscopic video, as the first crucial step to keyhole middle ear procedures through a tympanic membrane puncture. Six different artificial human temporal bones were included in this prospective study. Six stainless steel fiducial markers were glued on the periphery of the eardrum, and a high-resolution CT-scan of the temporal bone was obtained. Virtual endoscopy of the middle ear based on this CT-scan was conducted on Osirix software. Virtual endoscopy image was registered to the microscope-based video of the intact tympanic membrane based on fiducial markers and a homography transformation was applied during microscope movements. These movements were tracked using Speeded-Up Robust Features (SURF) method. Simultaneously, a micro-surgical instrument was identified and tracked using a Kalman filter. The 3D position of the instrument was extracted by solving a three-point perspective framework. For evaluation, the instrument was introduced through the tympanic membrane and ink droplets were injected on three middle ear structures. An average initial registration accuracy of 0.21 ± 0.10 mm (n = 3) was achieved with a slow propagation error during tracking (0.04 ± 0.07 mm). The estimated surgical instrument tip position error was 0.33 ± 0.22 mm. The target structures’ localization accuracy was 0.52 ± 0.15 mm. The submillimetric accuracy of our system without tracker is compatible with ear surgery.
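
For the instrument-tracking steps named in the description (Kalman-filter tracking of the micro-surgical instrument and a three-point perspective solve for its 3D position), a minimal sketch follows, again assuming Python with OpenCV. The constant-velocity state model, the noise covariances, and the helper names make_tip_kalman, track_tip and instrument_pose_p3p are assumptions for illustration rather than the study's actual code.

import cv2
import numpy as np

def make_tip_kalman():
    # Constant-velocity Kalman filter for the 2D tip position:
    # state (x, y, vx, vy), measurement (x, y).
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.eye(2, 4, dtype=np.float32)
    kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)      # assumed tuning
    kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)  # assumed tuning
    return kf

def track_tip(kf, detection_xy):
    # Predict, then correct with the detected tip position when the detector
    # returned one this frame; returns the smoothed (x, y) estimate.
    kf.predict()
    if detection_xy is not None:
        kf.correct(np.float32(detection_xy).reshape(2, 1))
    return kf.statePost[:2].ravel()

def instrument_pose_p3p(object_pts, image_pts, camera_matrix):
    # Perspective-three-point solve: three known 3D points on the instrument
    # and their image projections yield up to four candidate poses, which must
    # be disambiguated (e.g. with an extra point or a plausibility check).
    n, rvecs, tvecs = cv2.solveP3P(np.float32(object_pts).reshape(-1, 1, 3),
                                   np.float32(image_pts).reshape(-1, 1, 2),
                                   camera_matrix, np.zeros(4, np.float32),
                                   flags=cv2.SOLVEPNP_P3P)
    return rvecs[:n], tvecs[:n]
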
format Online
Article
Text
id pubmed-7174368
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-7174368 2020-04-24 Sci Rep Article Nature Publishing Group UK 2020-04-21 /pmc/articles/PMC7174368/ /pubmed/32317726 http://dx.doi.org/10.1038/s41598-020-63839-2 Text en © The Author(s) 2020 Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
title Video-based augmented reality combining CT-scan and instrument position data to microscope view in middle ear surgery
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7174368/
https://www.ncbi.nlm.nih.gov/pubmed/32317726
http://dx.doi.org/10.1038/s41598-020-63839-2