
Bridging the simulation-to-real gap for AI-based needle and target detection in robot-assisted ultrasound-guided interventions


Bibliographic Details
Main Authors: Arapi, Visar; Hardt-Stremayr, Alexander; Weiss, Stephan; Steinbrener, Jan
Format: Online Article Text
Language: English
Published: Springer Vienna, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10277269/
https://www.ncbi.nlm.nih.gov/pubmed/37332035
http://dx.doi.org/10.1186/s41747-023-00344-x
Collection: PubMed
Description:
BACKGROUND: Artificial intelligence (AI)-powered, robot-assisted, and ultrasound (US)-guided interventional radiology has the potential to increase the efficacy and cost-efficiency of interventional procedures while improving postsurgical outcomes and reducing the burden on medical personnel.
METHODS: To overcome the lack of clinical data available for training state-of-the-art AI models, we propose a novel approach for generating synthetic ultrasound data from real, clinical, preoperative three-dimensional (3D) data of different imaging modalities. With the synthetic data, we trained a deep learning-based detection algorithm for localizing the needle tip and target anatomy in US images. We validated our models on real, in vitro US data.
RESULTS: The resulting models generalize well to unseen synthetic data and to experimental in vitro data, making the proposed approach a promising method for creating AI-based models for needle and target detection in minimally invasive US-guided procedures. Moreover, we show that, after a one-time calibration of the US and robot coordinate frames, our tracking algorithm can accurately fine-position the robot within reach of the target based on 2D US images alone.
CONCLUSIONS: The proposed data generation approach is sufficient to bridge the simulation-to-real gap and has the potential to overcome data paucity challenges in interventional radiology. The proposed AI-based detection algorithm shows very promising results in terms of accuracy and frame rate.
RELEVANCE STATEMENT: This approach can facilitate the development of next-generation AI algorithms for patient anatomy detection and needle tracking in US, and their application to robotics.
KEY POINTS:
• AI-based methods show promise for needle and target detection in US-guided interventions.
• Publicly available, annotated datasets for training AI models are limited.
• Synthetic, clinical-like US data can be generated from magnetic resonance or computed tomography data.
• Models trained with synthetic US data generalize well to real in vitro US data.
• Target detection with an AI model can be used for fine positioning of the robot.
GRAPHICAL ABSTRACT: [Image: see text]
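The abstract notes that a one-time calibration between the US image frame and the robot coordinate frame lets 2D detections alone drive fine positioning of the robot. A minimal sketch of that general idea follows; the pixel scales, the calibration matrix, and all function names are illustrative assumptions, not the paper's actual calibration procedure:

```python
# Toy sketch: map a 2D ultrasound detection into the robot base frame
# using a fixed (one-time calibrated) homogeneous transform.
# All numbers here are invented for illustration.

def pixel_to_image_mm(u, v, sx_mm=0.2, sy_mm=0.2):
    """Scale pixel indices to millimetres in the US image plane (z = 0)."""
    return (u * sx_mm, v * sy_mm, 0.0)

def apply_transform(T, p):
    """Apply a 4x4 homogeneous transform T (row-major) to a 3D point p."""
    x, y, z = p
    vec = (x, y, z, 1.0)
    return tuple(sum(T[r][c] * vec[c] for c in range(4)) for r in range(3))

# Hypothetical calibration result: the US image frame is rotated 90 degrees
# about z and translated to (100, 50, 20) mm in the robot base frame.
T_robot_us = [
    [0.0, -1.0, 0.0, 100.0],
    [1.0,  0.0, 0.0,  50.0],
    [0.0,  0.0, 1.0,  20.0],
    [0.0,  0.0, 0.0,   1.0],
]

target_px = (320, 240)              # detector output in the 2D US image
p_us = pixel_to_image_mm(*target_px)
p_robot = apply_transform(T_robot_us, p_us)  # goal point for fine positioning
```

Because the transform is estimated once and then held fixed, every subsequent per-frame detection can be converted to a robot-frame goal with a single matrix multiply, which is what makes tracking from 2D US images alone feasible at interactive frame rates.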
format Online
Article
Text
id pubmed-10277269
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Springer Vienna
record_format MEDLINE/PubMed
Journal: Eur Radiol Exp, Original Article; published online 2023-06-19 by Springer Vienna.
License: © The Author(s) 2023. Open Access under the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution, and reproduction in any medium or format, provided appropriate credit is given to the original author(s) and the source, a link to the licence is provided, and any changes are indicated.