Event-based feature tracking in a visual inertial odometry framework

Bibliographic Details
Main Authors: Ribeiro-Gomes, José, Gaspar, José, Bernardino, Alexandre
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9971716/
https://www.ncbi.nlm.nih.gov/pubmed/36866151
http://dx.doi.org/10.3389/frobt.2023.994488
_version_ 1784898157477363712
author Ribeiro-Gomes, José
Gaspar, José
Bernardino, Alexandre
author_facet Ribeiro-Gomes, José
Gaspar, José
Bernardino, Alexandre
author_sort Ribeiro-Gomes, José
collection PubMed
description Introduction: Event cameras report pixel-wise brightness changes at high temporal resolution, allowing for high-speed tracking of features in visual inertial odometry (VIO) estimation, but require a paradigm shift, as common practices from the past decades of conventional cameras, such as feature detection and tracking, do not translate directly. One method for feature detection and tracking is the Event-based Kanade-Lucas-Tomasi tracker (EKLT), a hybrid approach that combines frames with events to provide high-speed tracking of features. Despite the high temporal resolution of the events, the local nature of feature registration imposes conservative limits on the camera motion speed. Methods: Our proposed approach expands on EKLT by running the event-based feature tracker concurrently with a visual inertial odometry system performing pose estimation, leveraging frames, events, and Inertial Measurement Unit (IMU) information to improve tracking. The problem of temporally combining high-rate IMU information with asynchronous event cameras is solved by means of an asynchronous probabilistic filter, in particular an Unscented Kalman Filter (UKF). The proposed feature tracking method based on EKLT takes into account the state estimate of the pose estimator running in parallel and provides this information to the feature tracker, resulting in a synergy that can improve not only the feature tracking but also the pose estimation. This approach can be seen as feedback, where the state estimate of the filter is fed back into the tracker, which then produces visual information for the filter, creating a “closed loop”. Results: The method is tested on rotational motions only, and comparisons between a conventional (not event-based) approach and the proposed approach are made using synthetic and real datasets. Results support that the use of events for the task improves performance. Discussion: To the best of our knowledge, this is the first work proposing the fusion of visual and inertial information from event cameras by means of a UKF, as well as the use of EKLT in the context of pose estimation. Furthermore, our closed-loop approach proved to be an improvement over the base EKLT, resulting in better feature tracking and pose estimation. The inertial information, despite being prone to drift over time, allows keeping track of features that would otherwise be lost. In turn, feature tracking synergistically helps estimate and minimize the drift.
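The closed-loop fusion described in the abstract lends itself to a short structural illustration. Below is a minimal, hypothetical Python sketch, not the authors' implementation: a generic Unscented Kalman Filter absorbs time-sorted IMU readings and event-based feature tracks (the asynchrony is handled by predicting the filter forward to each measurement's timestamp), and after every visual update the pose estimate is handed back to the tracker as a prior. All names here (Ukf, run_fusion, set_pose_prior, f_motion, h_feature) are assumptions made for illustration.

```python
# Hypothetical sketch of the closed-loop UKF fusion described in the abstract.
# Not the paper's implementation; all names and models are illustrative only.
import numpy as np

class Ukf:
    """Generic additive-noise UKF over an n-dimensional state."""
    def __init__(self, x0, P0, Q, kappa=1.0):
        self.x, self.P, self.Q = x0, P0, Q
        self.n, self.kappa = len(x0), kappa

    def _sigma_points(self):
        n, k = self.n, self.kappa
        S = np.linalg.cholesky((n + k) * self.P)          # matrix square root of scaled covariance
        pts = np.vstack([self.x, self.x + S.T, self.x - S.T])
        w = np.full(2 * n + 1, 1.0 / (2 * (n + k)))
        w[0] = k / (n + k)
        return pts, w

    def predict(self, f, dt):
        pts, w = self._sigma_points()
        pts = np.array([f(p, dt) for p in pts])           # propagate each sigma point
        self.x = w @ pts
        d = pts - self.x
        self.P = d.T @ (w[:, None] * d) + self.Q * dt

    def update(self, h, z, R):
        pts, w = self._sigma_points()
        Z = np.array([h(p) for p in pts])                 # predicted measurements
        z_hat = w @ Z
        dz, dx = Z - z_hat, pts - self.x
        S = dz.T @ (w[:, None] * dz) + R                  # innovation covariance
        C = dx.T @ (w[:, None] * dz)                      # state/measurement cross-covariance
        K = C @ np.linalg.inv(S)                          # Kalman gain
        self.x = self.x + K @ (z - z_hat)
        self.P = self.P - K @ S @ K.T

def run_fusion(ukf, measurements, tracker, f_motion, h_feature, R_feat):
    """measurements: time-sorted (t, kind, data) tuples, kind in {"imu", "track"}.
    Asynchrony is handled by predicting the filter forward to each
    measurement's timestamp before absorbing it."""
    t_prev = measurements[0][0]
    omega = np.zeros(3)                                   # last gyro reading (rad/s)
    for t, kind, data in measurements:
        ukf.predict(lambda x, dt: f_motion(x, omega, dt), max(t - t_prev, 1e-9))
        if kind == "imu":
            omega = data                                  # body angular rates
        else:                                             # event-based feature track
            ukf.update(h_feature, data, R_feat)
            tracker.set_pose_prior(ukf.x)                 # close the loop: pose back to tracker
        t_prev = t

if __name__ == "__main__":
    # Toy rotational-only demo: state = rotation vector (rad), small-angle model.
    f_motion = lambda x, w, dt: x + w * dt
    h_feature = lambda x: x[:2]                           # placeholder measurement model
    class Tracker:                                        # stand-in for an EKLT-style tracker
        def set_pose_prior(self, x): pass
    ukf = Ukf(np.zeros(3), np.eye(3) * 1e-3, np.eye(3) * 1e-4)
    meas = [(0.01, "imu", np.array([0.0, 0.0, 0.5])),
            (0.02, "track", np.array([0.0, 0.0])),
            (0.03, "imu", np.array([0.0, 0.0, 0.5]))]
    run_fusion(ukf, meas, Tracker(), f_motion, h_feature, np.eye(2) * 1e-2)
    print("estimated rotation vector:", ukf.x)
```

Feeding ukf.x back through set_pose_prior mirrors the feedback the abstract describes: the filter's pose estimate guides the tracker, and the tracker's output becomes the filter's next visual measurement.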
format Online
Article
Text
id pubmed-9971716
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-9971716 2023-03-01 Event-based feature tracking in a visual inertial odometry framework Ribeiro-Gomes, José Gaspar, José Bernardino, Alexandre Front Robot AI Robotics and AI Frontiers Media S.A. 2023-02-14 /pmc/articles/PMC9971716/ /pubmed/36866151 http://dx.doi.org/10.3389/frobt.2023.994488 Text en Copyright © 2023 Ribeiro-Gomes, Gaspar and Bernardino. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Robotics and AI
Ribeiro-Gomes, José
Gaspar, José
Bernardino, Alexandre
Event-based feature tracking in a visual inertial odometry framework
title Event-based feature tracking in a visual inertial odometry framework
title_full Event-based feature tracking in a visual inertial odometry framework
title_fullStr Event-based feature tracking in a visual inertial odometry framework
title_full_unstemmed Event-based feature tracking in a visual inertial odometry framework
title_short Event-based feature tracking in a visual inertial odometry framework
title_sort event-based feature tracking in a visual inertial odometry framework
topic Robotics and AI
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9971716/
https://www.ncbi.nlm.nih.gov/pubmed/36866151
http://dx.doi.org/10.3389/frobt.2023.994488
work_keys_str_mv AT ribeirogomesjose eventbasedfeaturetrackinginavisualinertialodometryframework
AT gasparjose eventbasedfeaturetrackinginavisualinertialodometryframework
AT bernardinoalexandre eventbasedfeaturetrackinginavisualinertialodometryframework