De-warping of images and improved eye tracking for the scanning laser ophthalmoscope

A limitation of scanning laser ophthalmoscopy (SLO) is that eye movements during the capture of each frame distort the retinal image. Various sophisticated strategies have been devised to ensure that each acquired frame can be mapped quickly and accurately onto a chosen reference frame, but such methods are blind to distortions in the reference frame itself. Here we explore a method to address this limitation in software, and demonstrate its accuracy. We used high-speed (200 fps), high-resolution (~1 μm), flood-based imaging of the human retina with adaptive optics to obtain “ground truth” information on the retinal image and motion of the eye. This information was used to simulate SLO video sequences at 20 fps, allowing us to compare various methods for eye-motion recovery and subsequent minimization of intra-frame distortion. We show that a) a single frame can be near-perfectly recovered with perfect knowledge of intra-frame eye motion; b) eye motion at a given time point within a frame can be accurately recovered by tracking the same strip of tissue across many frames, due to the stochastic symmetry of fixational eye movements. This approach is similar to, and easily adapted from, previously suggested strip-registration approaches; c) quality of frame recovery decreases with the amplitude of eye movements; however, the proposed method is less affected by this than other state-of-the-art methods and so offers even greater advantages when fixation is poor. The new method could easily be integrated into existing image processing software, and we provide an example implementation written in Matlab.

Bibliographic Details
Main Authors: Bedggood, Phillip; Metha, Andrew
Format: Online Article (Text)
Language: English
Published: Public Library of Science, 2017-04-03
Published in: PLoS One
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5378343/
https://www.ncbi.nlm.nih.gov/pubmed/28369065
http://dx.doi.org/10.1371/journal.pone.0174617
License: © 2017 Bedggood, Metha. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
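The abstract above describes recovering intra-frame eye motion by registering narrow strips of each SLO frame, so that each strip's displacement gives the eye position at its acquisition time and the frame can then be de-warped. As a rough illustration only (the paper provides its own Matlab implementation, which is not reproduced here), the following Python sketch shows generic strip-wise registration against a single reference frame via phase correlation. All function names (strip_offsets, dewarp) and parameters (n_strips) are illustrative assumptions, and the sketch omits the paper's key refinement of combining strip positions across many frames so that distortion of the reference frame itself averages out.

```python
# Minimal sketch of strip-based eye-motion estimation and de-warping for SLO
# frames. NOT the authors' implementation; names and defaults are assumptions.
import numpy as np

def strip_offsets(frame, reference, n_strips=16):
    """Estimate the (dy, dx) displacement of each horizontal strip of `frame`
    relative to `reference`, using FFT-based phase correlation."""
    h, w = frame.shape
    edges = np.linspace(0, h, n_strips + 1, dtype=int)
    offsets = []
    for top, bottom in zip(edges[:-1], edges[1:]):
        # Zero-pad the strip to the full frame size so both FFTs share dimensions.
        strip = np.zeros_like(reference, dtype=float)
        strip[top:bottom] = frame[top:bottom]
        # Normalised cross-power spectrum; its inverse FFT peaks at the shift.
        cross = np.fft.fft2(strip) * np.conj(np.fft.fft2(reference))
        cross /= np.abs(cross) + 1e-12
        corr = np.abs(np.fft.ifft2(cross))
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Wrap shifts into the range [-size/2, size/2).
        dy = int(dy) - h if dy > h // 2 else int(dy)
        dx = int(dx) - w if dx > w // 2 else int(dx)
        offsets.append((dy, dx))
    return offsets

def dewarp(frame, offsets):
    """Undo the estimated per-strip displacements (nearest-pixel shifts)."""
    h, w = frame.shape
    edges = np.linspace(0, h, len(offsets) + 1, dtype=int)
    out = np.zeros_like(frame)
    for (top, bottom), (dy, dx) in zip(zip(edges[:-1], edges[1:]), offsets):
        shifted = np.roll(np.roll(frame, -dy, axis=0), -dx, axis=1)
        out[top:bottom] = shifted[top:bottom]
    return out
```

In practice each frame of the video would be processed this way, and, as the paper proposes, the displacement estimates for corresponding strips would be tracked across many frames so that the stochastic symmetry of fixational eye movements cancels the distortion of the reference frame.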