Weakly supervised segmentation for real-time surgical tool tracking

Bibliographic Details
Main Authors: Lee, Eung-Joo, Plishker, William, Liu, Xinyang, Bhattacharyya, Shuvra S., Shekhar, Raj
Format: Online Article Text
Language: English
Published: The Institution of Engineering and Technology 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6952260/
https://www.ncbi.nlm.nih.gov/pubmed/32038863
http://dx.doi.org/10.1049/htl.2019.0083
Description
Summary: Surgical tool tracking has a variety of applications in different surgical scenarios. Electromagnetic (EM) tracking can be utilised for tool tracking, but its accuracy is often limited by magnetic interference. Vision-based methods have also been suggested; however, tracking robustness is limited by specular reflection, occlusions, and blurriness observed in the endoscopic image. Recently, deep learning-based methods have shown competitive performance on segmentation and tracking of surgical tools. The main bottleneck of these methods lies in acquiring a sufficient amount of pixel-wise, annotated training data, which demands substantial labour costs. To tackle this issue, the authors propose a weakly supervised method for surgical tool segmentation and tracking based on hybrid sensor systems. They first generate semantic labellings using EM tracking and laparoscopic image processing concurrently. They then train a lightweight deep segmentation network to obtain a binary segmentation mask that enables tool tracking. To the authors' knowledge, the proposed method is the first to integrate EM tracking and laparoscopic image processing for the generation of training labels. They demonstrate that their framework achieves accurate, automatic tool segmentation (i.e. without any manual labelling of the surgical tool to be tracked) and robust tool tracking in laparoscopic image sequences.
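The abstract describes training a lightweight segmentation network on weak binary masks derived from EM tracking and laparoscopic image processing. The following is a minimal sketch, assuming a PyTorch setup, of how such a network might be trained on weakly generated (image, mask) pairs; the architecture, loss, and placeholder data are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: a small encoder-decoder network trained on weak binary
# masks, in the spirit of the lightweight segmentation stage described above.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Lightweight encoder-decoder producing a single-channel tool mask."""
    def __init__(self, in_ch: int = 3, base: int = 16):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, base, 3, padding=1), nn.ReLU(),
                                  nn.Conv2d(base, base, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool2d(2)
        self.enc2 = nn.Sequential(nn.Conv2d(base, base * 2, 3, padding=1), nn.ReLU())
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec = nn.Sequential(nn.Conv2d(base * 2 + base, base, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(base, 1, 1))  # logits for the binary mask

    def forward(self, x):
        e1 = self.enc1(x)                      # full-resolution features
        e2 = self.enc2(self.pool(e1))          # half-resolution features
        d = torch.cat([self.up(e2), e1], dim=1)
        return self.dec(d)                     # (N, 1, H, W) logits

def train_step(model, optimiser, images, weak_masks):
    """One optimisation step on weakly generated (image, mask) pairs."""
    criterion = nn.BCEWithLogitsLoss()
    optimiser.zero_grad()
    logits = model(images)
    loss = criterion(logits, weak_masks)
    loss.backward()
    optimiser.step()
    return loss.item()

# Example usage with random tensors standing in for laparoscopic frames and
# the weak labels a hybrid EM/image pipeline would supply.
model = TinySegNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
frames = torch.rand(2, 3, 128, 128)                        # batch of endoscopic images
weak_labels = (torch.rand(2, 1, 128, 128) > 0.5).float()   # placeholder weak masks
print(train_step(model, opt, frames, weak_labels))
```

At inference, thresholding the sigmoid of the logits would yield the binary mask that drives tool tracking in each frame.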