
Accuracy and Speed Improvement of Event Camera Motion Estimation Using a Bird’s-Eye View Transformation

Event cameras are bio-inspired sensors with high dynamic range and high temporal resolution. These properties enable motion estimation from textures with repeating patterns, which is difficult with RGB cameras, so event-camera motion estimation is expected to be applied to vehicle position estimation. Contrast maximization is an existing method that can estimate event-camera motion from captured road surfaces. However, it tends to fall into local optima when estimating three-dimensional motion, which makes correct estimation difficult. To solve this problem, we propose a method that estimates motion by optimizing contrast in the bird’s-eye-view space. Instead of performing three-dimensional motion estimation, we reduce the problem to two-dimensional motion estimation by transforming the event data into a bird’s-eye view using a homography calculated from the event camera’s position. This transformation mitigates the non-convexity of the loss function that affects conventional methods. As a quantitative experiment, we created event data with a car simulator and evaluated our motion estimation method, showing improvements in accuracy and speed. In addition, we performed estimation on real event data and evaluated the results qualitatively, showing an improvement in accuracy.

Bibliographic Details
Main Authors: Ozawa, Takehiro; Sekikawa, Yusuke; Saito, Hideo
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8840125/
https://www.ncbi.nlm.nih.gov/pubmed/35161519
http://dx.doi.org/10.3390/s22030773
author Ozawa, Takehiro
Sekikawa, Yusuke
Saito, Hideo
collection PubMed
description Event cameras are bio-inspired sensors with high dynamic range and high temporal resolution. These properties enable motion estimation from textures with repeating patterns, which is difficult with RGB cameras, so event-camera motion estimation is expected to be applied to vehicle position estimation. Contrast maximization is an existing method that can estimate event-camera motion from captured road surfaces. However, it tends to fall into local optima when estimating three-dimensional motion, which makes correct estimation difficult. To solve this problem, we propose a method that estimates motion by optimizing contrast in the bird’s-eye-view space. Instead of performing three-dimensional motion estimation, we reduce the problem to two-dimensional motion estimation by transforming the event data into a bird’s-eye view using a homography calculated from the event camera’s position. This transformation mitigates the non-convexity of the loss function that affects conventional methods. As a quantitative experiment, we created event data with a car simulator and evaluated our motion estimation method, showing improvements in accuracy and speed. In addition, we performed estimation on real event data and evaluated the results qualitatively, showing an improvement in accuracy.
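The pipeline the abstract describes (warp events into a bird’s-eye view with a homography, then optimize contrast over a two-dimensional motion) can be sketched in a toy form. This is an illustrative reconstruction, not the authors’ implementation: the identity homography, the synthetic scene, the grid search, and the variance objective are all assumptions made for the sketch.

```python
import numpy as np

# Hedged sketch of the idea described above (all names/values are illustrative):
#  1) warp event pixel coordinates into the bird's-eye view with a homography H,
#  2) motion-compensate each warped event with a candidate 2D velocity v,
#  3) score v by the contrast (variance) of the resulting event-count image.

def warp_to_bev(xy, H):
    """Apply a 3x3 homography to Nx2 coordinates (homogeneous warp + divide)."""
    pts = np.hstack([xy, np.ones((len(xy), 1))])
    w = pts @ H.T
    return w[:, :2] / w[:, 2:3]

def contrast(xy, t, v, shape=(64, 64)):
    """Variance of the image of motion-compensated events for 2D velocity v."""
    shifted = xy - t[:, None] * v  # undo the candidate motion
    img, _, _ = np.histogram2d(shifted[:, 0], shifted[:, 1],
                               bins=shape, range=[[0, shape[0]], [0, shape[1]]])
    return img.var()

# Toy data: 25 scene points firing 500 events while translating at (5, 0).
rng = np.random.default_rng(0)
pts = rng.uniform(10.0, 50.0, size=(25, 2))
idx = rng.integers(0, 25, size=500)
t = rng.uniform(0.0, 1.0, size=500)
events = pts[idx] + t[:, None] * np.array([5.0, 0.0])

bev = warp_to_bev(events, np.eye(3))  # identity homography in this toy case

# 2D grid search: the true velocity should maximize contrast, because only it
# stacks every event from the same scene point back onto one sharp pixel.
candidates = [np.array([vx, 0.0]) for vx in np.linspace(0.0, 10.0, 21)]
best = max(candidates, key=lambda v: contrast(bev, t, v))
print(best)
```

In practice the homography would be computed from the camera’s pose relative to the road plane, and the search over velocities would be a continuous optimization rather than a grid; the point of the dimensionality reduction is that this 2D objective is better behaved than its 3D counterpart.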
format Online
Article
Text
id pubmed-8840125
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8840125 2022-02-13 Accuracy and Speed Improvement of Event Camera Motion Estimation Using a Bird’s-Eye View Transformation. Ozawa, Takehiro; Sekikawa, Yusuke; Saito, Hideo. Sensors (Basel), Article. MDPI 2022-01-20 /pmc/articles/PMC8840125/ /pubmed/35161519 http://dx.doi.org/10.3390/s22030773 Text en
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Accuracy and Speed Improvement of Event Camera Motion Estimation Using a Bird’s-Eye View Transformation
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8840125/
https://www.ncbi.nlm.nih.gov/pubmed/35161519
http://dx.doi.org/10.3390/s22030773