
A Saccade Based Framework for Real-Time Motion Segmentation Using Event Based Vision Sensors

Motion segmentation is a critical pre-processing step for autonomous robotic systems to facilitate tracking of moving objects in cluttered environments. Event-based sensors are low-power analog devices that represent a scene through asynchronous updates of only its dynamic details at high temporal resolution and hence require significantly fewer computations. However, motion segmentation using spatio-temporal data is challenging because the data are asynchronous. Prior approaches to object tracking with neuromorphic sensors perform well only while the sensor is static or a known model of the object to be followed is available. To address these limitations, this paper develops a technique for generalized motion segmentation based on spatial statistics across time frames. First, inspired by human saccadic eye movements, we create micromotion on the platform to facilitate separation of the static and dynamic elements of a scene. Second, we introduce spike-groups as a methodology for partitioning spatio-temporal event groups, which facilitates computation of scene statistics and characterization of the objects within it. Experimental results show that the algorithm classifies dynamic objects from a moving camera with a maximum accuracy of 92%.
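The spike-group partitioning described in the abstract can be sketched roughly as follows. This is a minimal illustration under assumed parameters (an 8-pixel spatial grid, 10 ms time windows, and a fixed event-count threshold), not the paper's actual algorithm; the names `spike_groups` and `classify_groups` and all numeric values are hypothetical.

```python
from collections import namedtuple

# An event-camera event: pixel location, timestamp (microseconds), polarity.
Event = namedtuple("Event", ["x", "y", "t", "p"])

def spike_groups(events, dt=10_000, cell=8):
    """Partition events into spatio-temporal groups.

    Hypothetical scheme: events falling in the same cell x cell pixel
    block within the same dt-microsecond window form one group.
    """
    groups = {}
    for e in events:
        key = (e.x // cell, e.y // cell, e.t // dt)
        groups.setdefault(key, []).append(e)
    return groups

def classify_groups(groups, rate_threshold=5):
    """Label a group 'dynamic' if its event count exceeds a threshold.

    Static scene parts, stimulated only by platform micromotion, are
    assumed to emit few events per window; moving objects emit many.
    """
    return {k: ("dynamic" if len(v) > rate_threshold else "static")
            for k, v in groups.items()}
```

A dense burst of events in one block would be labeled dynamic, while an isolated event elsewhere would be labeled static; per-group statistics of this kind are what allow scene characterization without reconstructing frames.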

Bibliographic Details
Main Authors: Mishra, Abhishek; Ghosh, Rohan; Principe, Jose C.; Thakor, Nitish V.; Kukreja, Sunil L.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2017
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5334512/
https://www.ncbi.nlm.nih.gov/pubmed/28316563
http://dx.doi.org/10.3389/fnins.2017.00083
author Mishra, Abhishek
Ghosh, Rohan
Principe, Jose C.
Thakor, Nitish V.
Kukreja, Sunil L.
collection PubMed
description Motion segmentation is a critical pre-processing step for autonomous robotic systems to facilitate tracking of moving objects in cluttered environments. Event-based sensors are low-power analog devices that represent a scene through asynchronous updates of only its dynamic details at high temporal resolution and hence require significantly fewer computations. However, motion segmentation using spatio-temporal data is challenging because the data are asynchronous. Prior approaches to object tracking with neuromorphic sensors perform well only while the sensor is static or a known model of the object to be followed is available. To address these limitations, this paper develops a technique for generalized motion segmentation based on spatial statistics across time frames. First, inspired by human saccadic eye movements, we create micromotion on the platform to facilitate separation of the static and dynamic elements of a scene. Second, we introduce spike-groups as a methodology for partitioning spatio-temporal event groups, which facilitates computation of scene statistics and characterization of the objects within it. Experimental results show that the algorithm classifies dynamic objects from a moving camera with a maximum accuracy of 92%.
format Online
Article
Text
id pubmed-5334512
institution National Center for Biotechnology Information
language English
publishDate 2017
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-5334512 (2017-03-17) A Saccade Based Framework for Real-Time Motion Segmentation Using Event Based Vision Sensors. Mishra, Abhishek; Ghosh, Rohan; Principe, Jose C.; Thakor, Nitish V.; Kukreja, Sunil L. Front Neurosci (Neuroscience). Frontiers Media S.A., 2017-03-03. /pmc/articles/PMC5334512/ /pubmed/28316563 http://dx.doi.org/10.3389/fnins.2017.00083 Text en Copyright © 2017 Mishra, Ghosh, Principe, Thakor and Kukreja. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY, http://creativecommons.org/licenses/by/4.0/). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
topic Neuroscience