SCTN: Event-based object tracking with energy-efficient deep convolutional spiking neural networks
Event cameras are asynchronous, neuromorphically inspired visual sensors that have shown great potential in object tracking because they can easily detect moving objects. Since event cameras output discrete events, they are inherently suited to work with Spiking Neural Networks (SNNs), which...
Main Authors: | Ji, Mingcheng, Wang, Ziling, Yan, Rui, Liu, Qingjie, Xu, Shu, Tang, Huajin |
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A. 2023 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9978206/ https://www.ncbi.nlm.nih.gov/pubmed/36875665 http://dx.doi.org/10.3389/fnins.2023.1123698 |
_version_ | 1784899467717115904 |
author | Ji, Mingcheng Wang, Ziling Yan, Rui Liu, Qingjie Xu, Shu Tang, Huajin |
author_facet | Ji, Mingcheng Wang, Ziling Yan, Rui Liu, Qingjie Xu, Shu Tang, Huajin |
author_sort | Ji, Mingcheng |
collection | PubMed |
description | Event cameras are asynchronous, neuromorphically inspired visual sensors that have shown great potential in object tracking because they can easily detect moving objects. Since event cameras output discrete events, they are inherently suited to work with Spiking Neural Networks (SNNs), which feature event-driven computation and energy-efficient operation. In this paper, we tackle the problem of event-based object tracking with a novel architecture built on a discriminatively trained SNN, called the Spiking Convolutional Tracking Network (SCTN). Taking a segment of events as input, SCTN not only exploits implicit associations among events better than event-wise processing, but also fully utilizes precise temporal information and maintains a sparse representation in segments rather than frames. To make SCTN more suitable for object tracking, we propose a new loss function that introduces an exponential Intersection over Union (IoU) term in the voltage domain. To the best of our knowledge, this is the first tracking network directly trained as an SNN. In addition, we present a new event-based tracking dataset, dubbed DVSOT21. Experimental results on DVSOT21 demonstrate that, in contrast to other competing trackers, our method achieves competitive performance with very low energy consumption compared to ANN-based trackers. With its lower energy consumption, tracking on neuromorphic hardware will reveal its advantage. |
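The description above mentions a loss function that introduces an exponential Intersection over Union (IoU) term in the voltage domain, but the record does not reproduce the formulation. The following is only a minimal sketch of what an exponential IoU-style box loss can look like in PyTorch; the function name `exponential_iou_loss`, the (x1, y1, x2, y2) box format, and the scale factor `k` are illustrative assumptions, and the voltage-domain coupling used by SCTN is not modeled here.

```python
# Hypothetical sketch only: not the paper's exact formulation.
import torch

def exponential_iou_loss(pred_boxes: torch.Tensor, gt_boxes: torch.Tensor, k: float = 2.0) -> torch.Tensor:
    """Exponential IoU-style loss for (N, 4) boxes in (x1, y1, x2, y2) format.

    The loss decays exponentially as the IoU between predicted and
    ground-truth boxes grows, so well-aligned predictions contribute little.
    """
    # Intersection rectangle
    x1 = torch.max(pred_boxes[:, 0], gt_boxes[:, 0])
    y1 = torch.max(pred_boxes[:, 1], gt_boxes[:, 1])
    x2 = torch.min(pred_boxes[:, 2], gt_boxes[:, 2])
    y2 = torch.min(pred_boxes[:, 3], gt_boxes[:, 3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)

    # Union = area(pred) + area(gt) - intersection
    area_pred = (pred_boxes[:, 2] - pred_boxes[:, 0]) * (pred_boxes[:, 3] - pred_boxes[:, 1])
    area_gt = (gt_boxes[:, 2] - gt_boxes[:, 0]) * (gt_boxes[:, 3] - gt_boxes[:, 1])
    iou = inter / (area_pred + area_gt - inter + 1e-6)

    # Exponential weighting: low IoU -> loss near 1, high IoU -> loss near exp(-k)
    return torch.exp(-k * iou).mean()
```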
format | Online Article Text |
id | pubmed-9978206 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-99782062023-03-03 SCTN: Event-based object tracking with energy-efficient deep convolutional spiking neural networks Ji, Mingcheng Wang, Ziling Yan, Rui Liu, Qingjie Xu, Shu Tang, Huajin Front Neurosci Neuroscience Event cameras are asynchronous, neuromorphically inspired visual sensors that have shown great potential in object tracking because they can easily detect moving objects. Since event cameras output discrete events, they are inherently suited to work with Spiking Neural Networks (SNNs), which feature event-driven computation and energy-efficient operation. In this paper, we tackle the problem of event-based object tracking with a novel architecture built on a discriminatively trained SNN, called the Spiking Convolutional Tracking Network (SCTN). Taking a segment of events as input, SCTN not only exploits implicit associations among events better than event-wise processing, but also fully utilizes precise temporal information and maintains a sparse representation in segments rather than frames. To make SCTN more suitable for object tracking, we propose a new loss function that introduces an exponential Intersection over Union (IoU) term in the voltage domain. To the best of our knowledge, this is the first tracking network directly trained as an SNN. In addition, we present a new event-based tracking dataset, dubbed DVSOT21. Experimental results on DVSOT21 demonstrate that, in contrast to other competing trackers, our method achieves competitive performance with very low energy consumption compared to ANN-based trackers. With its lower energy consumption, tracking on neuromorphic hardware will reveal its advantage. Frontiers Media S.A. 2023-02-16 /pmc/articles/PMC9978206/ /pubmed/36875665 http://dx.doi.org/10.3389/fnins.2023.1123698 Text en Copyright © 2023 Ji, Wang, Yan, Liu, Xu and Tang. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Ji, Mingcheng Wang, Ziling Yan, Rui Liu, Qingjie Xu, Shu Tang, Huajin SCTN: Event-based object tracking with energy-efficient deep convolutional spiking neural networks |
title | SCTN: Event-based object tracking with energy-efficient deep convolutional spiking neural networks |
title_full | SCTN: Event-based object tracking with energy-efficient deep convolutional spiking neural networks |
title_fullStr | SCTN: Event-based object tracking with energy-efficient deep convolutional spiking neural networks |
title_full_unstemmed | SCTN: Event-based object tracking with energy-efficient deep convolutional spiking neural networks |
title_short | SCTN: Event-based object tracking with energy-efficient deep convolutional spiking neural networks |
title_sort | sctn: event-based object tracking with energy-efficient deep convolutional spiking neural networks |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9978206/ https://www.ncbi.nlm.nih.gov/pubmed/36875665 http://dx.doi.org/10.3389/fnins.2023.1123698 |
work_keys_str_mv | AT jimingcheng sctneventbasedobjecttrackingwithenergyefficientdeepconvolutionalspikingneuralnetworks AT wangziling sctneventbasedobjecttrackingwithenergyefficientdeepconvolutionalspikingneuralnetworks AT yanrui sctneventbasedobjecttrackingwithenergyefficientdeepconvolutionalspikingneuralnetworks AT liuqingjie sctneventbasedobjecttrackingwithenergyefficientdeepconvolutionalspikingneuralnetworks AT xushu sctneventbasedobjecttrackingwithenergyefficientdeepconvolutionalspikingneuralnetworks AT tanghuajin sctneventbasedobjecttrackingwithenergyefficientdeepconvolutionalspikingneuralnetworks |