
Learned optical flow for intra-operative tracking of the retinal fundus

PURPOSE: Sustained delivery of regenerative retinal therapies by robotic systems requires intra-operative tracking of the retinal fundus. We propose a supervised deep convolutional neural network to densely predict semantic segmentation and optical flow of the retina as mutually supportive tasks, implicitly inpainting retinal flow information missing due to occlusion by surgical tools.

Full description
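The abstract above describes a U-Net-based network that jointly predicts semantic segmentation and optical flow of the retina from a pair of frames. The following is a minimal PyTorch sketch of such a dual-head encoder-decoder, intended only as an illustration: the class name, layer widths, class count, and the channel-wise stacking of the two frames are assumptions, not the authors' architecture.

```python
# Illustrative sketch only: a compact dual-head encoder-decoder in PyTorch,
# NOT the paper's exact network. Layer sizes, the number of segmentation
# classes, and stacking the two frames channel-wise are assumptions.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class DualHeadUNet(nn.Module):
    """Maps a stacked frame pair (6 channels) to per-pixel segmentation
    logits and a 2-channel optical flow field."""

    def __init__(self, num_classes=3):
        super().__init__()
        self.enc1 = conv_block(6, 32)
        self.enc2 = conv_block(32, 64)
        self.enc3 = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.seg_head = nn.Conv2d(32, num_classes, 1)   # segmentation logits
        self.flow_head = nn.Conv2d(32, 2, 1)            # (u, v) flow in pixels

    def forward(self, frame_pair):
        e1 = self.enc1(frame_pair)
        e2 = self.enc2(self.pool(e1))
        e3 = self.enc3(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(e3), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.seg_head(d1), self.flow_head(d1)


# Example: two 256x256 RGB frames stacked along the channel dimension.
net = DualHeadUNet(num_classes=3)
pair = torch.randn(1, 6, 256, 256)
seg_logits, flow = net(pair)
print(seg_logits.shape, flow.shape)  # (1, 3, 256, 256) and (1, 2, 256, 256)
```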

Bibliographic Details
Main Authors: Ravasio, Claudio S., Pissas, Theodoros, Bloch, Edward, Flores, Blanca, Jalali, Sepehr, Stoyanov, Danail, Cardoso, Jorge M., Da Cruz, Lyndon, Bergeles, Christos
Format: Online Article Text
Language: English
Published: Springer International Publishing 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7261285/
https://www.ncbi.nlm.nih.gov/pubmed/32323210
http://dx.doi.org/10.1007/s11548-020-02160-9
author Ravasio, Claudio S.
Pissas, Theodoros
Bloch, Edward
Flores, Blanca
Jalali, Sepehr
Stoyanov, Danail
Cardoso, Jorge M.
Da Cruz, Lyndon
Bergeles, Christos
collection PubMed
description PURPOSE: Sustained delivery of regenerative retinal therapies by robotic systems requires intra-operative tracking of the retinal fundus. We propose a supervised deep convolutional neural network to densely predict semantic segmentation and optical flow of the retina as mutually supportive tasks, implicitly inpainting retinal flow information missing due to occlusion by surgical tools. METHODS: As manual annotation of optical flow is infeasible, we propose a flexible algorithm for generation of large synthetic training datasets on the basis of given intra-operative retinal images. We evaluate optical flow estimation by tracking a grid and sparsely annotated ground truth points on a benchmark of challenging real intra-operative clips obtained from an extensive internally acquired dataset encompassing representative vitreoretinal surgical cases. RESULTS: The U-Net-based network trained on the synthetic dataset is shown to generalise well to the benchmark of real surgical videos. When used to track retinal points of interest, our flow estimation outperforms variational baseline methods on clips containing tool motions which occlude the points of interest, as is routinely observed in intra-operatively recorded surgery videos. CONCLUSIONS: The results indicate that complex synthetic training datasets can be used to specifically guide optical flow estimation. Our proposed algorithm therefore lays the foundation for a robust system which can assist with intra-operative tracking of moving surgical targets even when occluded.
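The METHODS part of the description states that manual flow annotation is infeasible, so training pairs with known flow are synthesised from given intra-operative retinal images. Below is a minimal sketch, assuming a smoothed random displacement field as the warp model; the function names and magnitudes are hypothetical and this is a generic stand-in, not the generation algorithm proposed in the paper.

```python
# Illustrative sketch only: synthesising a training pair with known flow by
# warping a single retinal image with a random smooth displacement field.
# The warp model and parameters are assumptions, not the paper's algorithm.
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates


def random_smooth_flow(height, width, max_px=8.0, sigma=32.0, seed=None):
    """Random displacement field, Gaussian-smoothed so the motion is coherent."""
    rng = np.random.default_rng(seed)
    flow = rng.standard_normal((2, height, width))
    flow = gaussian_filter(flow, sigma=(0, sigma, sigma))
    flow *= max_px / (np.abs(flow).max() + 1e-8)   # scale to +/- max_px pixels
    return flow                                    # flow[0] = dy, flow[1] = dx


def warp_image(image, flow):
    """Backward-warp a greyscale image: each output pixel samples the input at
    (y + dy, x + dx), so `flow` maps the warped frame back to the input frame."""
    h, w = image.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    coords = np.stack([ys + flow[0], xs + flow[1]])
    return map_coordinates(image, coords, order=1, mode="nearest")


# Example with a stand-in array; in practice this would be a fundus frame.
frame1 = np.random.rand(256, 256).astype(np.float32)
gt_flow = random_smooth_flow(256, 256, seed=0)
frame2 = warp_image(frame1, gt_flow)
print(frame2.shape, gt_flow.shape)  # (256, 256) and (2, 256, 256)
```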
format Online
Article
Text
id pubmed-7261285
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Springer International Publishing
record_format MEDLINE/PubMed
spelling pubmed-72612852020-06-10 Learned optical flow for intra-operative tracking of the retinal fundus Ravasio, Claudio S. Pissas, Theodoros Bloch, Edward Flores, Blanca Jalali, Sepehr Stoyanov, Danail Cardoso, Jorge M. Da Cruz, Lyndon Bergeles, Christos Int J Comput Assist Radiol Surg Original Article PURPOSE: Sustained delivery of regenerative retinal therapies by robotic systems requires intra-operative tracking of the retinal fundus. We propose a supervised deep convolutional neural network to densely predict semantic segmentation and optical flow of the retina as mutually supportive tasks, implicitly inpainting retinal flow information missing due to occlusion by surgical tools. METHODS: As manual annotation of optical flow is infeasible, we propose a flexible algorithm for generation of large synthetic training datasets on the basis of given intra-operative retinal images. We evaluate optical flow estimation by tracking a grid and sparsely annotated ground truth points on a benchmark of challenging real intra-operative clips obtained from an extensive internally acquired dataset encompassing representative vitreoretinal surgical cases. RESULTS: The U-Net-based network trained on the synthetic dataset is shown to generalise well to the benchmark of real surgical videos. When used to track retinal points of interest, our flow estimation outperforms variational baseline methods on clips containing tool motions which occlude the points of interest, as is routinely observed in intra-operatively recorded surgery videos. CONCLUSIONS: The results indicate that complex synthetic training datasets can be used to specifically guide optical flow estimation. Our proposed algorithm therefore lays the foundation for a robust system which can assist with intra-operative tracking of moving surgical targets even when occluded. Springer International Publishing 2020-04-22 2020 /pmc/articles/PMC7261285/ /pubmed/32323210 http://dx.doi.org/10.1007/s11548-020-02160-9 Text en © The Author(s) 2020 Open AccessThis article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
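The evaluation described above tracks a grid and sparsely annotated points through real surgical clips using the estimated flow. The sketch below shows one generic way to chain per-frame flow fields into point trajectories; the flow convention, interpolation choice, and function name are assumptions, not the paper's evaluation protocol.

```python
# Illustrative sketch only: propagating annotated retinal points through a
# clip by chaining per-frame flow estimates. Assumes each flow field gives
# the (dy, dx) motion of frame t's pixels into frame t+1.
import numpy as np
from scipy.ndimage import map_coordinates


def track_points(points_yx, flows):
    """points_yx: (N, 2) array of (y, x) positions in the first frame.
    flows: iterable of (2, H, W) flow fields, one per frame transition.
    Returns a (T+1, N, 2) array of tracked positions."""
    track = [np.asarray(points_yx, dtype=np.float64)]
    for flow in flows:
        pts = track[-1]
        # Bilinearly sample the flow at each point's current sub-pixel location.
        dy = map_coordinates(flow[0], pts.T, order=1, mode="nearest")
        dx = map_coordinates(flow[1], pts.T, order=1, mode="nearest")
        track.append(pts + np.stack([dy, dx], axis=1))
    return np.stack(track)


# Example with random flows; in practice these come from the learned network.
rng = np.random.default_rng(0)
flows = [rng.uniform(-1, 1, size=(2, 256, 256)) for _ in range(10)]
points = np.array([[128.0, 128.0], [60.0, 200.0]])
trajectories = track_points(points, flows)
print(trajectories.shape)  # (11, 2, 2): 11 time steps, 2 points, (y, x)
```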
title Learned optical flow for intra-operative tracking of the retinal fundus
topic Original Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7261285/
https://www.ncbi.nlm.nih.gov/pubmed/32323210
http://dx.doi.org/10.1007/s11548-020-02160-9
work_keys_str_mv AT ravasioclaudios learnedopticalflowforintraoperativetrackingoftheretinalfundus
AT pissastheodoros learnedopticalflowforintraoperativetrackingoftheretinalfundus
AT blochedward learnedopticalflowforintraoperativetrackingoftheretinalfundus
AT floresblanca learnedopticalflowforintraoperativetrackingoftheretinalfundus
AT jalalisepehr learnedopticalflowforintraoperativetrackingoftheretinalfundus
AT stoyanovdanail learnedopticalflowforintraoperativetrackingoftheretinalfundus
AT cardosojorgem learnedopticalflowforintraoperativetrackingoftheretinalfundus
AT dacruzlyndon learnedopticalflowforintraoperativetrackingoftheretinalfundus
AT bergeleschristos learnedopticalflowforintraoperativetrackingoftheretinalfundus