
Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks

PURPOSE: During needle interventions, successful automated detection of the needle immediately after insertion is necessary to allow the physician to identify and correct any misalignment between the needle and the target at an early stage, which reduces the number of needle passes and improves health outcomes. METHODS: We...

Full description

Bibliographic Details
Main Authors: Pourtaherian, Arash, Ghazvinian Zanjani, Farhad, Zinger, Svitlana, Mihajlovic, Nenad, Ng, Gary C., Korsten, Hendrikus H. M., de With, Peter H. N.
Format: Online Article Text
Language: English
Published: Springer International Publishing 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6132402/
https://www.ncbi.nlm.nih.gov/pubmed/29855770
http://dx.doi.org/10.1007/s11548-018-1798-3
_version_ 1783354313638150144
author Pourtaherian, Arash
Ghazvinian Zanjani, Farhad
Zinger, Svitlana
Mihajlovic, Nenad
Ng, Gary C.
Korsten, Hendrikus H. M.
de With, Peter H. N.
author_facet Pourtaherian, Arash
Ghazvinian Zanjani, Farhad
Zinger, Svitlana
Mihajlovic, Nenad
Ng, Gary C.
Korsten, Hendrikus H. M.
de With, Peter H. N.
author_sort Pourtaherian, Arash
collection PubMed
description PURPOSE: During needle interventions, successful automated detection of the needle immediately after insertion is necessary to allow the physician to identify and correct any misalignment between the needle and the target at an early stage, which reduces the number of needle passes and improves health outcomes. METHODS: We present a novel approach to localize partially inserted needles in a 3D ultrasound volume with high precision using convolutional neural networks. We propose two methods based on patch classification and semantic segmentation of the needle from orthogonal 2D cross-sections extracted from the volume. For patch classification, each voxel is classified from locally extracted raw data of three orthogonal planes centered on it. We propose a bootstrap resampling approach to enhance training on our highly imbalanced data. For semantic segmentation, parts of the needle are detected in cross-sections perpendicular to the lateral and elevational axes. We propose to exploit the structural information in the data with a novel thick-slice processing approach for efficient modeling of the context. RESULTS: The introduced methods successfully detect 17 and 22 G needles with a single trained network, demonstrating a robust, generalized approach. Extensive ex-vivo evaluations on chicken-breast and porcine-leg datasets yield F1-scores of 80 and 84%, respectively. Furthermore, very short needles are detected with tip localization errors of less than 0.7 mm for lengths of only 5 and 10 mm at voxel sizes of 0.2 and 0.36 mm, respectively. CONCLUSION: Our method accurately detects even very short needles, ensuring that the needle and its tip remain maximally visible in the visualized plane during the entire intervention, thereby eliminating the need for advanced bi-manual coordination of the needle and transducer.
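As a concrete illustration of the patch-classification route described above, the sketch below extracts three orthogonal 2D patches centered on a voxel and draws class-balanced mini-batches by resampling the rare needle voxels. It is a minimal NumPy sketch under assumed settings (patch size, axis order, batch composition), not the authors' implementation.

```python
import numpy as np

def triplanar_patches(volume, center, half=16):
    """Extract three orthogonal 2D patches centered on a voxel.

    volume : 3D NumPy array (elevational, lateral, axial order assumed).
    center : (z, y, x) voxel index.
    half   : half of the patch width; 2*half patch size is an
             illustrative choice, not the paper's setting.
    """
    # Zero-pad so patches near the volume border keep a constant size.
    padded = np.pad(volume, half, mode="constant")
    z, y, x = (c + half for c in center)
    p0 = padded[z, y - half:y + half, x - half:x + half]   # plane perpendicular to axis 0
    p1 = padded[z - half:z + half, y, x - half:x + half]   # plane perpendicular to axis 1
    p2 = padded[z - half:z + half, y - half:y + half, x]   # plane perpendicular to axis 2
    return np.stack([p0, p1, p2])                          # shape: (3, 2*half, 2*half)


def balanced_batch(patches, labels, batch_size=64, rng=None):
    """Draw a mini-batch that oversamples the rare needle class -- one
    simple reading of the bootstrap-resampling idea in the abstract."""
    rng = rng or np.random.default_rng()
    pos = np.flatnonzero(labels == 1)        # needle voxels (rare)
    neg = np.flatnonzero(labels == 0)        # background voxels (abundant)
    idx = np.concatenate([
        rng.choice(pos, batch_size // 2, replace=True),    # resample with replacement
        rng.choice(neg, batch_size // 2, replace=False),
    ])
    rng.shuffle(idx)
    return patches[idx], labels[idx]
```

The stacked tri-planar patch can then be fed to a small CNN classifier that labels the central voxel as needle or background.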
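The semantic-segmentation route detects needle parts in cross-sections perpendicular to the lateral and elevational axes, using a thick-slice treatment of the volume for context. One plausible way to realize thick slices, assumed here purely for illustration, is to stack a few adjacent cross-sections as the input channels of a 2D segmentation network:

```python
import numpy as np

def thick_slices(volume, axis=1, thickness=3):
    """Yield (index, thick_slice) pairs along one axis of a 3D volume.

    Each thick slice stacks `thickness` adjacent cross-sections
    (channels-first) so a 2D segmentation network sees some 3D context.
    The axis convention and thickness are assumptions for illustration.
    """
    half = thickness // 2
    pad_width = [(0, 0)] * volume.ndim
    pad_width[axis] = (half, half)                 # replicate edge slices at the borders
    padded = np.pad(volume, pad_width, mode="edge")
    for i in range(volume.shape[axis]):
        sl = np.take(padded, range(i, i + thickness), axis=axis)
        yield i, np.moveaxis(sl, axis, 0)          # shape: (thickness, H, W)

# Example usage with a hypothetical 2D segmentation network `seg_net`:
# for i, s in thick_slices(ultrasound_volume, axis=1):
#     needle_mask = seg_net(s[np.newaxis])         # predict needle pixels in slice i
```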
format Online
Article
Text
id pubmed-6132402
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher Springer International Publishing
record_format MEDLINE/PubMed
spelling pubmed-6132402 2018-09-14 Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks Pourtaherian, Arash Ghazvinian Zanjani, Farhad Zinger, Svitlana Mihajlovic, Nenad Ng, Gary C. Korsten, Hendrikus H. M. de With, Peter H. N. Int J Comput Assist Radiol Surg Original Article PURPOSE: During needle interventions, successful automated detection of the needle immediately after insertion is necessary to allow the physician to identify and correct any misalignment between the needle and the target at an early stage, which reduces the number of needle passes and improves health outcomes. METHODS: We present a novel approach to localize partially inserted needles in a 3D ultrasound volume with high precision using convolutional neural networks. We propose two methods based on patch classification and semantic segmentation of the needle from orthogonal 2D cross-sections extracted from the volume. For patch classification, each voxel is classified from locally extracted raw data of three orthogonal planes centered on it. We propose a bootstrap resampling approach to enhance training on our highly imbalanced data. For semantic segmentation, parts of the needle are detected in cross-sections perpendicular to the lateral and elevational axes. We propose to exploit the structural information in the data with a novel thick-slice processing approach for efficient modeling of the context. RESULTS: The introduced methods successfully detect 17 and 22 G needles with a single trained network, demonstrating a robust, generalized approach. Extensive ex-vivo evaluations on chicken-breast and porcine-leg datasets yield F1-scores of 80 and 84%, respectively. Furthermore, very short needles are detected with tip localization errors of less than 0.7 mm for lengths of only 5 and 10 mm at voxel sizes of 0.2 and 0.36 mm, respectively. CONCLUSION: Our method accurately detects even very short needles, ensuring that the needle and its tip remain maximally visible in the visualized plane during the entire intervention, thereby eliminating the need for advanced bi-manual coordination of the needle and transducer. Springer International Publishing 2018-05-31 2018 /pmc/articles/PMC6132402/ /pubmed/29855770 http://dx.doi.org/10.1007/s11548-018-1798-3 Text en © The Author(s) 2018 Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
spellingShingle Original Article
Pourtaherian, Arash
Ghazvinian Zanjani, Farhad
Zinger, Svitlana
Mihajlovic, Nenad
Ng, Gary C.
Korsten, Hendrikus H. M.
de With, Peter H. N.
Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks
title Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks
title_full Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks
title_fullStr Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks
title_full_unstemmed Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks
title_short Robust and semantic needle detection in 3D ultrasound using orthogonal-plane convolutional neural networks
title_sort robust and semantic needle detection in 3d ultrasound using orthogonal-plane convolutional neural networks
topic Original Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6132402/
https://www.ncbi.nlm.nih.gov/pubmed/29855770
http://dx.doi.org/10.1007/s11548-018-1798-3
work_keys_str_mv AT pourtaherianarash robustandsemanticneedledetectionin3dultrasoundusingorthogonalplaneconvolutionalneuralnetworks
AT ghazvinianzanjanifarhad robustandsemanticneedledetectionin3dultrasoundusingorthogonalplaneconvolutionalneuralnetworks
AT zingersvitlana robustandsemanticneedledetectionin3dultrasoundusingorthogonalplaneconvolutionalneuralnetworks
AT mihajlovicnenad robustandsemanticneedledetectionin3dultrasoundusingorthogonalplaneconvolutionalneuralnetworks
AT nggaryc robustandsemanticneedledetectionin3dultrasoundusingorthogonalplaneconvolutionalneuralnetworks
AT korstenhendrikushm robustandsemanticneedledetectionin3dultrasoundusingorthogonalplaneconvolutionalneuralnetworks
AT dewithpeterhn robustandsemanticneedledetectionin3dultrasoundusingorthogonalplaneconvolutionalneuralnetworks