
Weakly supervised segmentation for real-time surgical tool tracking

Surgical tool tracking has a variety of applications in different surgical scenarios. Electromagnetic (EM) tracking can be utilised for tool tracking, but the accuracy is often limited by magnetic interference. Vision-based methods have also been suggested; however, tracking robustness is limited by...


Bibliographic Details
Main Authors: Lee, Eung-Joo, Plishker, William, Liu, Xinyang, Bhattacharyya, Shuvra S., Shekhar, Raj
Format: Online Article Text
Language: English
Published: The Institution of Engineering and Technology 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6952260/
https://www.ncbi.nlm.nih.gov/pubmed/32038863
http://dx.doi.org/10.1049/htl.2019.0083
_version_ 1783486413020332032
author Lee, Eung-Joo
Plishker, William
Liu, Xinyang
Bhattacharyya, Shuvra S.
Shekhar, Raj
author_facet Lee, Eung-Joo
Plishker, William
Liu, Xinyang
Bhattacharyya, Shuvra S.
Shekhar, Raj
author_sort Lee, Eung-Joo
collection PubMed
description Surgical tool tracking has a variety of applications in different surgical scenarios. Electromagnetic (EM) tracking can be utilised for tool tracking, but the accuracy is often limited by magnetic interference. Vision-based methods have also been suggested; however, tracking robustness is limited by specular reflection, occlusions, and blurriness observed in the endoscopic image. Recently, deep learning-based methods have shown competitive performance on segmentation and tracking of surgical tools. The main bottleneck of these methods lies in acquiring a sufficient amount of pixel-wise, annotated training data, which demands substantial labour costs. To tackle this issue, the authors propose a weakly supervised method for surgical tool segmentation and tracking based on hybrid sensor systems. They first generate semantic labellings using EM tracking and laparoscopic image processing concurrently. They then train a light-weight deep segmentation network to obtain a binary segmentation mask that enables tool tracking. To the authors’ knowledge, the proposed method is the first to integrate EM tracking and laparoscopic image processing for generation of training labels. They demonstrate that their framework achieves accurate, automatic tool segmentation (i.e. without any manual labelling of the surgical tool to be tracked) and robust tool tracking in laparoscopic image sequences.
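The description above summarises the pipeline only at a high level: weak labels are first generated by combining EM tracking with laparoscopic image processing, and a light-weight segmentation network is then trained on those labels to predict a binary tool mask. As an illustrative aid only, the following minimal Python/PyTorch sketch shows what the training step of such a light-weight binary segmentation network could look like. The network architecture, hyperparameters, tensor shapes, and the stand-in weak masks are all assumptions for demonstration and do not reflect the authors' actual implementation or label-generation procedure.

# Minimal sketch (not the authors' code): train a small encoder-decoder to predict
# a binary tool mask from laparoscopic frames, using weak masks that are assumed to
# have been produced beforehand from EM tracking plus image processing.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Deliberately small encoder-decoder producing a 1-channel mask logit map."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                  # H/2 x W/2
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                  # H/4 x W/4
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(16, 8, 2, stride=2), nn.ReLU(inplace=True),
            nn.Conv2d(8, 1, 1),                               # per-pixel logits
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def train_step(model, optimizer, frames, weak_masks):
    """One optimisation step on a batch of frames and their weak (noisy) masks."""
    criterion = nn.BCEWithLogitsLoss()       # binary tool-vs-background objective
    optimizer.zero_grad()
    logits = model(frames)
    loss = criterion(logits, weak_masks)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    # Random tensors stand in for real laparoscopic frames and EM-derived weak labels.
    model = TinySegNet()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    frames = torch.rand(4, 3, 128, 128)                      # B x C x H x W
    weak_masks = (torch.rand(4, 1, 128, 128) > 0.9).float()  # sparse binary masks
    print("loss:", train_step(model, optimizer, frames, weak_masks))

At inference time, thresholding the sigmoid of the predicted logits would yield the binary segmentation mask from which the tool can be localised and tracked frame to frame, as described in the abstract.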
format Online
Article
Text
id pubmed-6952260
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher The Institution of Engineering and Technology
record_format MEDLINE/PubMed
spelling pubmed-69522602020-02-07 Weakly supervised segmentation for real-time surgical tool tracking Lee, Eung-Joo Plishker, William Liu, Xinyang Bhattacharyya, Shuvra S. Shekhar, Raj Healthc Technol Lett Special Issue: Papers from the 13th Workshop on Augmented Environments for Computer Assisted Interventions Surgical tool tracking has a variety of applications in different surgical scenarios. Electromagnetic (EM) tracking can be utilised for tool tracking, but the accuracy is often limited by magnetic interference. Vision-based methods have also been suggested; however, tracking robustness is limited by specular reflection, occlusions, and blurriness observed in the endoscopic image. Recently, deep learning-based methods have shown competitive performance on segmentation and tracking of surgical tools. The main bottleneck of these methods lies in acquiring a sufficient amount of pixel-wise, annotated training data, which demands substantial labour costs. To tackle this issue, the authors propose a weakly supervised method for surgical tool segmentation and tracking based on hybrid sensor systems. They first generate semantic labellings using EM tracking and laparoscopic image processing concurrently. They then train a light-weight deep segmentation network to obtain a binary segmentation mask that enables tool tracking. To the authors’ knowledge, the proposed method is the first to integrate EM tracking and laparoscopic image processing for generation of training labels. They demonstrate that their framework achieves accurate, automatic tool segmentation (i.e. without any manual labelling of the surgical tool to be tracked) and robust tool tracking in laparoscopic image sequences. The Institution of Engineering and Technology 2019-11-26 /pmc/articles/PMC6952260/ /pubmed/32038863 http://dx.doi.org/10.1049/htl.2019.0083 Text en http://creativecommons.org/licenses/by-nc/3.0/ This is an open access article published by the IET under the Creative Commons Attribution -NonCommercial License (http://creativecommons.org/licenses/by-nc/3.0/)
spellingShingle Special Issue: Papers from the 13th Workshop on Augmented Environments for Computer Assisted Interventions
Lee, Eung-Joo
Plishker, William
Liu, Xinyang
Bhattacharyya, Shuvra S.
Shekhar, Raj
Weakly supervised segmentation for real-time surgical tool tracking
title Weakly supervised segmentation for real-time surgical tool tracking
title_full Weakly supervised segmentation for real-time surgical tool tracking
title_fullStr Weakly supervised segmentation for real-time surgical tool tracking
title_full_unstemmed Weakly supervised segmentation for real-time surgical tool tracking
title_short Weakly supervised segmentation for real-time surgical tool tracking
title_sort weakly supervised segmentation for real-time surgical tool tracking
topic Special Issue: Papers from the 13th Workshop on Augmented Environments for Computer Assisted Interventions
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6952260/
https://www.ncbi.nlm.nih.gov/pubmed/32038863
http://dx.doi.org/10.1049/htl.2019.0083
work_keys_str_mv AT leeeungjoo weaklysupervisedsegmentationforrealtimesurgicaltooltracking
AT plishkerwilliam weaklysupervisedsegmentationforrealtimesurgicaltooltracking
AT liuxinyang weaklysupervisedsegmentationforrealtimesurgicaltooltracking
AT bhattacharyyashuvras weaklysupervisedsegmentationforrealtimesurgicaltooltracking
AT shekharraj weaklysupervisedsegmentationforrealtimesurgicaltooltracking