EVtracker: An Event-Driven Spatiotemporal Method for Dynamic Object Tracking

Bibliographic Details
Main Authors: Zhang, Shixiong; Wang, Wenmin; Li, Honglei; Zhang, Shenyong
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9414578/
https://www.ncbi.nlm.nih.gov/pubmed/36015851
http://dx.doi.org/10.3390/s22166090
author Zhang, Shixiong
Wang, Wenmin
Li, Honglei
Zhang, Shenyong
collection PubMed
description An event camera is a novel bio-inspired sensor that effectively compensates for the shortcomings of current frame cameras, which include high latency, low dynamic range, motion blur, etc. Rather than capturing images at a fixed frame rate, an event camera produces an asynchronous signal by measuring the brightness change of each pixel. Consequently, an appropriate algorithm framework that can handle the unique data types of event-based vision is required. In this paper, we propose a dynamic object tracking framework using an event camera to achieve long-term stable tracking of event objects. One of the key novel features of our approach is to adopt an adaptive strategy that adjusts the spatiotemporal domain of event data. To achieve this, we reconstruct event images from high-speed asynchronous streaming data via online learning. Additionally, we apply the Siamese network to extract features from event data. In contrast to earlier models that only extract hand-crafted features, our method provides powerful feature description and a more flexible reconstruction strategy for event data. We assess our algorithm in three challenging scenarios: 6-DoF (six degrees of freedom), translation, and rotation. Unlike fixed cameras in traditional object tracking tasks, all three tracking scenarios involve the simultaneous violent rotation and shaking of both the camera and objects. Results from extensive experiments suggest that our proposed approach achieves superior accuracy and robustness compared to other state-of-the-art methods. Without reducing time efficiency, our novel method exhibits a [Formula: see text] increase in accuracy over other recent models. Furthermore, results indicate that event cameras are capable of robust object tracking, which is a task that conventional cameras cannot adequately perform, especially for super-fast motion tracking and challenging lighting situations.
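The description explains that an event camera emits an asynchronous stream of per-pixel brightness-change events rather than fixed-rate frames, and that the tracker reconstructs event images from this stream over an adjustable spatiotemporal window. The paper's own implementation is not reproduced here; the following is a minimal, hedged sketch of that basic reconstruction step. The function name, the sensor resolution, and the 10 ms window length are illustrative assumptions, not values from the paper.

```python
import numpy as np

def events_to_frame(events, width=240, height=180, t_start=0.0, window=0.01):
    """Accumulate events falling in [t_start, t_start + window) into a frame.

    events: array of shape (N, 4) with columns (x, y, t, polarity in {-1, +1}).
    Returns a (height, width) float array of signed event counts per pixel.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    t_end = t_start + window
    for x, y, t, p in events:
        if t_start <= t < t_end:
            # ON events (+1) add to the pixel, OFF events (-1) subtract
            frame[int(y), int(x)] += p
    return frame

# Example: two ON events and one OFF event at pixel (x=10, y=5) inside the
# window; the last event arrives at t=0.5 s, outside the 10 ms window.
evts = np.array([[10, 5, 0.001,  1.0],
                 [10, 5, 0.002,  1.0],
                 [10, 5, 0.003, -1.0],
                 [50, 40, 0.5,   1.0]])
print(events_to_frame(evts)[5, 10])  # → 1.0
```

An adaptive strategy such as the one the paper describes would vary `window` (and the spatial region) online based on event density or tracking confidence, rather than fixing it as this sketch does.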
format Online Article Text
id pubmed-9414578
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9414578 2022-08-27 EVtracker: An Event-Driven Spatiotemporal Method for Dynamic Object Tracking. Zhang, Shixiong; Wang, Wenmin; Li, Honglei; Zhang, Shenyong. Sensors (Basel), Article. MDPI 2022-08-15 /pmc/articles/PMC9414578/ /pubmed/36015851 http://dx.doi.org/10.3390/s22166090 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title EVtracker: An Event-Driven Spatiotemporal Method for Dynamic Object Tracking
topic Article