
Computer-Aided Breast Surgery Framework Using a Markerless Augmented Reality Method

This study proposes a markerless Augmented Reality (AR) surgical framework for breast lesion removal using a depth sensor and 3D breast Computed Tomography (CT) images. A patient mesh in the real coordinate system is acquired through a patient 3D scan using a depth sensor for registration. The patient mesh on the virtual coordinate system is obtained by contrast-based skin segmentation in 3D mesh generated from breast CT scans. Then, the nipple area is detected based on the gradient in the segmented skin area. The region of interest (ROI) is set based on the detection result to select the vertices in the virtual coordinate system. The mesh on the real and virtual coordinate systems is first aligned by matching the center of mass, and the Iterative Closest Point (ICP) method is applied to perform more precise registration. Experimental results of 20 patients’ data showed 98.35 ± 0.71% skin segmentation accuracy in terms of Dice Similarity Coefficient (DSC) value, 2.79 ± 1.54 mm nipple detection error, and 4.69 ± 1.95 mm registration error. Experiments using phantom and patient data also confirmed high accuracy in AR visualization. The proposed method in this study showed that the 3D AR visualization of medical data on the patient’s body is possible by using a single depth sensor without having to use markers.
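The registration step summarized in the abstract (center-of-mass pre-alignment of the depth-sensor point cloud to the CT-derived skin vertices, followed by ICP refinement) can be sketched compactly. The snippet below is a minimal illustration in Python using only NumPy and SciPy, not the authors' implementation: it assumes the skin segmentation and nipple-based ROI selection have already produced two vertex arrays (`real_pts` from the depth scan, `virtual_pts` from the CT mesh), and all function names are hypothetical. A Dice helper is included only to show how a DSC value like the one reported is typically computed.

```python
# Minimal sketch (not the paper's code) of center-of-mass alignment + ICP
# between a depth-sensor point cloud and CT-derived skin mesh vertices.
import numpy as np
from scipy.spatial import cKDTree


def best_rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def register(real_pts, virtual_pts, iters=50, tol=1e-6):
    """Center-of-mass pre-alignment followed by point-to-point ICP."""
    # Coarse alignment: translate so both clouds share a center of mass.
    moving = real_pts + (virtual_pts.mean(axis=0) - real_pts.mean(axis=0))
    tree = cKDTree(virtual_pts)
    prev_err = np.inf
    for _ in range(iters):
        dists, idx = tree.query(moving)              # closest-point matches
        R, t = best_rigid_transform(moving, virtual_pts[idx])
        moving = moving @ R.T + t                    # apply incremental transform
        err = dists.mean()                           # mean closest-point distance
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return moving, err


def dice(mask_a, mask_b):
    """Dice Similarity Coefficient between two boolean segmentation masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * inter / (mask_a.sum() + mask_b.sum())
```

The Kabsch/SVD step gives the least-squares rigid transform for each set of closest-point matches, and the loop stops once the mean closest-point distance (the kind of registration error reported in millimetres above) no longer decreases.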


Bibliographic Details
Main Authors: Khang, Seungwoo, Park, Taeyong, Lee, Junwoo, Kim, Kyung Won, Song, Hyunjoo, Lee, Jeongjin
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9777271/
https://www.ncbi.nlm.nih.gov/pubmed/36553130
http://dx.doi.org/10.3390/diagnostics12123123
_version_ 1784856062353997824
author Khang, Seungwoo
Park, Taeyong
Lee, Junwoo
Kim, Kyung Won
Song, Hyunjoo
Lee, Jeongjin
author_facet Khang, Seungwoo
Park, Taeyong
Lee, Junwoo
Kim, Kyung Won
Song, Hyunjoo
Lee, Jeongjin
author_sort Khang, Seungwoo
collection PubMed
description This study proposes a markerless Augmented Reality (AR) surgical framework for breast lesion removal using a depth sensor and 3D breast Computed Tomography (CT) images. A patient mesh in the real coordinate system is acquired through a patient 3D scan using a depth sensor for registration. The patient mesh on the virtual coordinate system is obtained by contrast-based skin segmentation in 3D mesh generated from breast CT scans. Then, the nipple area is detected based on the gradient in the segmented skin area. The region of interest (ROI) is set based on the detection result to select the vertices in the virtual coordinate system. The mesh on the real and virtual coordinate systems is first aligned by matching the center of mass, and the Iterative Closest Point (ICP) method is applied to perform more precise registration. Experimental results of 20 patients’ data showed 98.35 ± 0.71% skin segmentation accuracy in terms of Dice Similarity Coefficient (DSC) value, 2.79 ± 1.54 mm nipple detection error, and 4.69 ± 1.95 mm registration error. Experiments using phantom and patient data also confirmed high accuracy in AR visualization. The proposed method in this study showed that the 3D AR visualization of medical data on the patient’s body is possible by using a single depth sensor without having to use markers.
format Online
Article
Text
id pubmed-9777271
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9777271 2022-12-23 Computer-Aided Breast Surgery Framework Using a Markerless Augmented Reality Method Khang, Seungwoo Park, Taeyong Lee, Junwoo Kim, Kyung Won Song, Hyunjoo Lee, Jeongjin Diagnostics (Basel) Article This study proposes a markerless Augmented Reality (AR) surgical framework for breast lesion removal using a depth sensor and 3D breast Computed Tomography (CT) images. A patient mesh in the real coordinate system is acquired through a patient 3D scan using a depth sensor for registration. The patient mesh on the virtual coordinate system is obtained by contrast-based skin segmentation in 3D mesh generated from breast CT scans. Then, the nipple area is detected based on the gradient in the segmented skin area. The region of interest (ROI) is set based on the detection result to select the vertices in the virtual coordinate system. The mesh on the real and virtual coordinate systems is first aligned by matching the center of mass, and the Iterative Closest Point (ICP) method is applied to perform more precise registration. Experimental results of 20 patients’ data showed 98.35 ± 0.71% skin segmentation accuracy in terms of Dice Similarity Coefficient (DSC) value, 2.79 ± 1.54 mm nipple detection error, and 4.69 ± 1.95 mm registration error. Experiments using phantom and patient data also confirmed high accuracy in AR visualization. The proposed method in this study showed that the 3D AR visualization of medical data on the patient’s body is possible by using a single depth sensor without having to use markers. MDPI 2022-12-11 /pmc/articles/PMC9777271/ /pubmed/36553130 http://dx.doi.org/10.3390/diagnostics12123123 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Khang, Seungwoo
Park, Taeyong
Lee, Junwoo
Kim, Kyung Won
Song, Hyunjoo
Lee, Jeongjin
Computer-Aided Breast Surgery Framework Using a Markerless Augmented Reality Method
title Computer-Aided Breast Surgery Framework Using a Markerless Augmented Reality Method
title_full Computer-Aided Breast Surgery Framework Using a Markerless Augmented Reality Method
title_fullStr Computer-Aided Breast Surgery Framework Using a Markerless Augmented Reality Method
title_full_unstemmed Computer-Aided Breast Surgery Framework Using a Markerless Augmented Reality Method
title_short Computer-Aided Breast Surgery Framework Using a Markerless Augmented Reality Method
title_sort computer-aided breast surgery framework using a markerless augmented reality method
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9777271/
https://www.ncbi.nlm.nih.gov/pubmed/36553130
http://dx.doi.org/10.3390/diagnostics12123123
work_keys_str_mv AT khangseungwoo computeraidedbreastsurgeryframeworkusingamarkerlessaugmentedrealitymethod
AT parktaeyong computeraidedbreastsurgeryframeworkusingamarkerlessaugmentedrealitymethod
AT leejunwoo computeraidedbreastsurgeryframeworkusingamarkerlessaugmentedrealitymethod
AT kimkyungwon computeraidedbreastsurgeryframeworkusingamarkerlessaugmentedrealitymethod
AT songhyunjoo computeraidedbreastsurgeryframeworkusingamarkerlessaugmentedrealitymethod
AT leejeongjin computeraidedbreastsurgeryframeworkusingamarkerlessaugmentedrealitymethod