
FPGA-Based Multimodal Embedded Sensor System Integrating Low- and Mid-Level Vision

Bibliographic Details
Main Authors: Botella, Guillermo; Martín H., José Antonio; Santos, Matilde; Meyer-Baese, Uwe
Format: Online Article Text
Language: English
Published: Molecular Diversity Preservation International (MDPI), 2011
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3231703/
https://www.ncbi.nlm.nih.gov/pubmed/22164069
http://dx.doi.org/10.3390/s110808164
Description
Summary: Motion estimation is a low-level vision task that is especially relevant due to its wide range of applications in the real world. Many of the best motion estimation algorithms incorporate features found in mammalian visual systems, which demand huge computational resources and are therefore not usually available in real time. In this paper we present a novel bioinspired sensor based on the synergy between optical flow and orthogonal variant moments. The bioinspired sensor has been designed for Very Large Scale Integration (VLSI) using properties of the mammalian cortical motion pathway. This sensor combines low-level primitives (optical flow and image moments) in order to produce a mid-level vision abstraction layer. The results are presented through experiments that show the validity of the proposed system, together with an analysis of the computational resources and performance of the applied algorithms.
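
As a rough illustration of the idea of fusing low-level primitives into a mid-level description, the following Python sketch combines a Lucas-Kanade-style gradient flow estimate with plain raw image moments. It is only a software analogy under stated assumptions (generic raw moments instead of the paper's orthogonal variant moments, a synthetic image pair, an arbitrary window size), not the authors' FPGA/VLSI design.

# Minimal illustrative sketch: combine two low-level primitives
# (image moments and a gradient-based optical flow estimate) into a
# coarse mid-level motion descriptor. All parameters are assumptions.
import numpy as np

def image_moments(frame):
    # Zeroth-order moment (mass) and intensity centroid from raw moments.
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    m00 = frame.sum()
    cx = (xs * frame).sum() / m00
    cy = (ys * frame).sum() / m00
    return m00, (cx, cy)

def local_flow(prev, curr, cx, cy, win=15):
    # Lucas-Kanade-style least-squares flow in a window centred on (cx, cy).
    Iy, Ix = np.gradient(prev.astype(float))
    It = curr.astype(float) - prev.astype(float)
    y0, y1 = int(cy) - win, int(cy) + win
    x0, x1 = int(cx) - win, int(cx) + win
    ix, iy, it = Ix[y0:y1, x0:x1], Iy[y0:y1, x0:x1], It[y0:y1, x0:x1]
    A = np.stack([ix.ravel(), iy.ravel()], axis=1)
    b = -it.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)  # solve Ix*u + Iy*v = -It
    return u, v

if __name__ == "__main__":
    # Synthetic frame pair: a Gaussian blob shifted by (2, 1) pixels.
    h, w = 64, 64
    ys, xs = np.mgrid[0:h, 0:w]
    blob = lambda x0, y0: np.exp(-((xs - x0) ** 2 + (ys - y0) ** 2) / 50.0)
    prev, curr = blob(30, 30), blob(32, 31)

    mass, (cx, cy) = image_moments(prev)
    u, v = local_flow(prev, curr, cx, cy)
    # Mid-level abstraction: where the structure is and how it moves.
    print(f"centroid=({cx:.1f}, {cy:.1f})  flow=({u:.2f}, {v:.2f})")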