A Benchmark Environment for Neuromorphic Stereo Vision
Without neuromorphic hardware, artificial stereo vision suffers from high resource demands and processing times impeding real-time capability. This is mainly caused by high frame rates, a quality feature for conventional cameras, generating large amounts of redundant data. Neuromorphic visual sensors generate less redundant and more relevant data, solving the issue of over- and undersampling at the same time. However, they require a rethinking of processing, as established techniques in conventional stereo vision do not exploit the potential of their event-based operation principle. Many alternatives have recently been proposed, but they have yet to be evaluated on a common data basis. We propose a benchmark environment offering the methods and tools to compare different algorithms for depth reconstruction from two event-based sensors. To this end, an experimental setup consisting of two event-based sensors and one depth sensor, as well as a framework enabling synchronized, calibrated data recording, is presented. Furthermore, we define metrics enabling a meaningful comparison of the examined algorithms, covering aspects such as performance, precision and applicability. To evaluate the benchmark, a stereo matching algorithm was implemented as a testing candidate, and multiple experiments with different settings and camera parameters were carried out. This work is a foundation for a robust and flexible evaluation of the multitude of new techniques for event-based stereo vision, allowing a meaningful comparison.
| Main Authors: | Steffen, L.; Elfgen, M.; Ulbrich, S.; Roennau, A.; Dillmann, R. |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Frontiers Media S.A., 2021 |
| Subjects: | Robotics and AI |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8170485/ https://www.ncbi.nlm.nih.gov/pubmed/34095240 http://dx.doi.org/10.3389/frobt.2021.647634 |
_version_ | 1783702256807313408 |
---|---|
author | Steffen, L.; Elfgen, M.; Ulbrich, S.; Roennau, A.; Dillmann, R. |
author_facet | Steffen, L.; Elfgen, M.; Ulbrich, S.; Roennau, A.; Dillmann, R. |
author_sort | Steffen, L. |
collection | PubMed |
description | Without neuromorphic hardware, artificial stereo vision suffers from high resource demands and processing times impeding real-time capability. This is mainly caused by high frame rates, a quality feature for conventional cameras, generating large amounts of redundant data. Neuromorphic visual sensors generate less redundant and more relevant data, solving the issue of over- and undersampling at the same time. However, they require a rethinking of processing, as established techniques in conventional stereo vision do not exploit the potential of their event-based operation principle. Many alternatives have recently been proposed, but they have yet to be evaluated on a common data basis. We propose a benchmark environment offering the methods and tools to compare different algorithms for depth reconstruction from two event-based sensors. To this end, an experimental setup consisting of two event-based sensors and one depth sensor, as well as a framework enabling synchronized, calibrated data recording, is presented. Furthermore, we define metrics enabling a meaningful comparison of the examined algorithms, covering aspects such as performance, precision and applicability. To evaluate the benchmark, a stereo matching algorithm was implemented as a testing candidate, and multiple experiments with different settings and camera parameters were carried out. This work is a foundation for a robust and flexible evaluation of the multitude of new techniques for event-based stereo vision, allowing a meaningful comparison. |
format | Online Article Text |
id | pubmed-8170485 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-8170485 2021-06-03. A Benchmark Environment for Neuromorphic Stereo Vision. Steffen, L.; Elfgen, M.; Ulbrich, S.; Roennau, A.; Dillmann, R. Front Robot AI, Robotics and AI. Frontiers Media S.A. 2021-05-19. /pmc/articles/PMC8170485/ /pubmed/34095240 http://dx.doi.org/10.3389/frobt.2021.647634 Text en Copyright © 2021 Steffen, Elfgen, Ulbrich, Roennau and Dillmann.
https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Robotics and AI; Steffen, L.; Elfgen, M.; Ulbrich, S.; Roennau, A.; Dillmann, R.; A Benchmark Environment for Neuromorphic Stereo Vision |
title | A Benchmark Environment for Neuromorphic Stereo Vision |
title_full | A Benchmark Environment for Neuromorphic Stereo Vision |
title_fullStr | A Benchmark Environment for Neuromorphic Stereo Vision |
title_full_unstemmed | A Benchmark Environment for Neuromorphic Stereo Vision |
title_short | A Benchmark Environment for Neuromorphic Stereo Vision |
title_sort | benchmark environment for neuromorphic stereo vision |
topic | Robotics and AI |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8170485/ https://www.ncbi.nlm.nih.gov/pubmed/34095240 http://dx.doi.org/10.3389/frobt.2021.647634 |
work_keys_str_mv | AT steffenl abenchmarkenvironmentforneuromorphicstereovision AT elfgenm abenchmarkenvironmentforneuromorphicstereovision AT ulbrichs abenchmarkenvironmentforneuromorphicstereovision AT roennaua abenchmarkenvironmentforneuromorphicstereovision AT dillmannr abenchmarkenvironmentforneuromorphicstereovision AT steffenl benchmarkenvironmentforneuromorphicstereovision AT elfgenm benchmarkenvironmentforneuromorphicstereovision AT ulbrichs benchmarkenvironmentforneuromorphicstereovision AT roennaua benchmarkenvironmentforneuromorphicstereovision AT dillmannr benchmarkenvironmentforneuromorphicstereovision |