
Optical flow estimation from event-based cameras and spiking neural networks


Bibliographic Details
Main Authors: Cuadrado, Javier, Rançon, Ulysse, Cottereau, Benoit R., Barranco, Francisco, Masquelier, Timothée
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10210135/
https://www.ncbi.nlm.nih.gov/pubmed/37250425
http://dx.doi.org/10.3389/fnins.2023.1160034
_version_ 1785047002864680960
author Cuadrado, Javier
Rançon, Ulysse
Cottereau, Benoit R.
Barranco, Francisco
Masquelier, Timothée
author_facet Cuadrado, Javier
Rançon, Ulysse
Cottereau, Benoit R.
Barranco, Francisco
Masquelier, Timothée
author_sort Cuadrado, Javier
collection PubMed
description Event-based cameras are raising interest within the computer vision community. These sensors operate with asynchronous pixels, emitting events, or “spikes”, when the luminance change at a given pixel since the last event surpasses a certain threshold. Thanks to their inherent qualities, such as their low power consumption, low latency, and high dynamic range, they seem particularly tailored to applications with challenging temporal constraints and safety requirements. Event-based sensors are an excellent fit for Spiking Neural Networks (SNNs), since the coupling of an asynchronous sensor with neuromorphic hardware can yield real-time systems with minimal power requirements. In this work, we seek to develop one such system, using both event sensor data from the DSEC dataset and spiking neural networks to estimate optical flow for driving scenarios. We propose a U-Net-like SNN which, after supervised training, is able to make dense optical flow estimations. To do so, we encourage both minimal norm for the error vector and minimal angle between ground-truth and predicted flow, training our model with back-propagation using a surrogate gradient. In addition, the use of 3d convolutions allows us to capture the dynamic nature of the data by increasing the temporal receptive fields. Upsampling after each decoding stage ensures that each decoder's output contributes to the final estimation. Thanks to separable convolutions, we have been able to develop a light model (when compared to competitors) that can nonetheless yield reasonably accurate optical flow estimates.
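The abstract above describes a training objective that jointly encourages a minimal norm for the flow-error vector and a minimal angle between ground-truth and predicted flow. A minimal sketch of one such combined loss follows; the per-pixel loop, the `lambda_angle` weight, and the `eps` stabilizer are illustrative assumptions, not the paper's actual formulation.

```python
import math

def flow_loss(pred, gt, lambda_angle=1.0, eps=1e-8):
    """Combined optical-flow loss over per-pixel 2D flow vectors.

    pred, gt: sequences of (u, v) tuples, one per pixel.
    Returns mean endpoint error plus a weighted mean angular error.
    """
    epe_sum = 0.0
    ang_sum = 0.0
    for (pu, pv), (gu, gv) in zip(pred, gt):
        # Norm of the error vector (endpoint error).
        epe_sum += math.hypot(pu - gu, pv - gv)
        # Angle between predicted and ground-truth flow vectors,
        # via the clamped cosine; eps guards against zero-length vectors.
        dot = pu * gu + pv * gv
        denom = math.hypot(pu, pv) * math.hypot(gu, gv) + eps
        ang_sum += math.acos(max(-1.0, min(1.0, dot / denom)))
    n = len(pred)
    return epe_sum / n + lambda_angle * ang_sum / n
```

In the paper's setting this would be computed over dense flow maps with a tensor library and back-propagated through the SNN using a surrogate gradient, as the abstract notes; this scalar sketch only illustrates the two terms.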
format Online
Article
Text
id pubmed-10210135
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-10210135 2023-05-26 Optical flow estimation from event-based cameras and spiking neural networks Cuadrado, Javier Rançon, Ulysse Cottereau, Benoit R. Barranco, Francisco Masquelier, Timothée Front Neurosci Neuroscience Frontiers Media S.A.
2023-05-11 /pmc/articles/PMC10210135/ /pubmed/37250425 http://dx.doi.org/10.3389/fnins.2023.1160034 Text en Copyright © 2023 Cuadrado, Rançon, Cottereau, Barranco and Masquelier. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Cuadrado, Javier
Rançon, Ulysse
Cottereau, Benoit R.
Barranco, Francisco
Masquelier, Timothée
Optical flow estimation from event-based cameras and spiking neural networks
title Optical flow estimation from event-based cameras and spiking neural networks
title_full Optical flow estimation from event-based cameras and spiking neural networks
title_fullStr Optical flow estimation from event-based cameras and spiking neural networks
title_full_unstemmed Optical flow estimation from event-based cameras and spiking neural networks
title_short Optical flow estimation from event-based cameras and spiking neural networks
title_sort optical flow estimation from event-based cameras and spiking neural networks
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10210135/
https://www.ncbi.nlm.nih.gov/pubmed/37250425
http://dx.doi.org/10.3389/fnins.2023.1160034
work_keys_str_mv AT cuadradojavier opticalflowestimationfromeventbasedcamerasandspikingneuralnetworks
AT ranconulysse opticalflowestimationfromeventbasedcamerasandspikingneuralnetworks
AT cottereaubenoitr opticalflowestimationfromeventbasedcamerasandspikingneuralnetworks
AT barrancofrancisco opticalflowestimationfromeventbasedcamerasandspikingneuralnetworks
AT masqueliertimothee opticalflowestimationfromeventbasedcamerasandspikingneuralnetworks