
Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors

Bibliographic Details
Main Authors: Everding, Lukas; Conradt, Jörg
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2018
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5825909/
https://www.ncbi.nlm.nih.gov/pubmed/29515386
http://dx.doi.org/10.3389/fnbot.2018.00004
author Everding, Lukas
Conradt, Jörg
collection PubMed
description In order to safely navigate and orient in their local surroundings, autonomous systems need to rapidly extract and persistently track visual features from the environment. While many algorithms tackle these tasks for traditional frame-based cameras, they have to deal with the fact that conventional cameras sample their environment at a fixed frequency. Most prominently, the same features have to be found in consecutive frames, and corresponding features then need to be matched using elaborate techniques, as any information between two frames is lost. We introduce a novel method to detect and track line structures in the data streams of event-based silicon retinae [also known as dynamic vision sensors (DVS)]. In contrast to conventional cameras, these biologically inspired sensors generate a quasi-continuous stream of vision information analogous to the information stream created by the ganglion cells in mammalian retinae. All pixels of a DVS operate asynchronously, without a periodic sampling rate, and emit a so-called DVS address event as soon as they perceive a luminance change exceeding an adjustable threshold. We use the high temporal resolution achieved by the DVS to track features continuously through time instead of only at fixed points in time. The focus of this work lies on tracking lines in a mostly static environment observed by a moving camera, a typical setting in mobile robotics. Since DVS events are mostly generated at object boundaries and edges, which in man-made environments often form lines, lines were chosen as the feature to track. Our method is based on detecting planes of DVS address events in x-y-t space and tracing these planes through time. It is robust against noise and runs in real time on a standard computer, which makes it suitable for low-latency robotics. Efficacy and performance are evaluated on real-world data sets showing artificial structures in an office building, using event data from a DAVIS240C sensor for tracking and its frame data for ground-truth estimation.
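The core geometric idea in the abstract is that a straight edge moving relative to the camera sweeps out a plane of address events in x-y-t space. The Python sketch below illustrates that idea with a plain least-squares plane fit; it is a minimal illustration under assumed conventions, not the authors' implementation, and the function name `fit_event_plane` and the synthetic data are hypothetical.

```python
# Minimal sketch of the x-y-t plane idea (not the paper's actual algorithm).
# Assumption: a moving straight edge produces address events (x, y, t) that
# lie near a plane t = a*x + b*y + c; this parameterization is valid when
# the edge is moving, so the plane is not parallel to the t-axis.
import numpy as np

def fit_event_plane(events: np.ndarray):
    """Least-squares fit of t = a*x + b*y + c to an (N, 3) array of events.

    Columns are pixel coordinates x, y and the event timestamp t.
    Returns the coefficients (a, b, c) and the fit residual, which gives
    a crude measure of how plane-like (and thus how line-like) the
    neighborhood of events is.
    """
    A = np.column_stack([events[:, 0], events[:, 1], np.ones(len(events))])
    coeffs, residuals, _, _ = np.linalg.lstsq(A, events[:, 2], rcond=None)
    return coeffs, residuals

# Hypothetical usage with synthetic events from an edge sweeping across a
# 240x180 sensor (the DAVIS240C resolution mentioned in the abstract).
rng = np.random.default_rng(0)
x = rng.uniform(0, 240, 1000)
y = rng.uniform(0, 180, 1000)
t = 0.5 * x + 0.1 * y + 3.0 + rng.normal(0.0, 0.05, 1000)  # noisy plane
(a, b, c), res = fit_event_plane(np.column_stack([x, y, t]))
print(f"fitted plane: t = {a:.2f}*x + {b:.2f}*y + {c:.2f}")
```

Tracking would then amount to updating the plane parameters incrementally as new events arrive rather than refitting from scratch ("tracing these planes through time," in the abstract's words), which is what allows the method to exploit the sensor's high temporal resolution at low latency.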
format Online
Article
Text
id pubmed-5825909
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-5825909 2018-03-07 Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors Everding, Lukas; Conradt, Jörg Front Neurorobot Neuroscience Frontiers Media S.A. 2018-02-19 /pmc/articles/PMC5825909/ /pubmed/29515386 http://dx.doi.org/10.3389/fnbot.2018.00004 Text en Copyright © 2018 Everding and Conradt. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5825909/
https://www.ncbi.nlm.nih.gov/pubmed/29515386
http://dx.doi.org/10.3389/fnbot.2018.00004