
Event-driven proto-object based saliency in 3D space to attract a robot’s attention

Bibliographic Details
Main authors: Ghosh, Suman, D’Angelo, Giulia, Glover, Arren, Iacono, Massimiliano, Niebur, Ernst, Bartolozzi, Chiara
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2022
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9090933/
https://www.ncbi.nlm.nih.gov/pubmed/35538154
http://dx.doi.org/10.1038/s41598-022-11723-6
collection PubMed
description To interact with its environment, a robot working in 3D space needs to organise its visual input in terms of objects or their perceptual precursors, proto-objects. Among other visual cues, depth is a submodality used to direct attention to visual features and objects. Current depth-based proto-object attention models have been implemented for standard RGB-D cameras that produce synchronous frames. In contrast, event cameras are neuromorphic sensors that loosely mimic the function of the human retina by asynchronously encoding per-pixel brightness changes at very high temporal resolution, thereby providing advantages like high dynamic range, efficiency (thanks to their high degree of signal compression), and low latency. We propose a bio-inspired bottom-up attention model that exploits event-driven sensing to generate depth-based saliency maps that allow a robot to interact with complex visual input. We use event-cameras mounted in the eyes of the iCub humanoid robot to directly extract edge, disparity and motion information. Real-world experiments demonstrate that our system robustly selects salient objects near the robot in the presence of clutter and dynamic scene changes, for the benefit of downstream applications like object segmentation, tracking and robot interaction with external objects.
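The description above summarises the pipeline in prose: asynchronous events are accumulated into a saliency surface, nearer (low-depth) proto-objects contribute more strongly, and attention is directed to the most salient location. The toy Python sketch below is purely illustrative and not the authors' implementation; the leaky-integration decay constant, the inverse-depth weighting, and the winner-take-all selection are all assumptions made for the example.

```python
import numpy as np

H, W = 64, 64   # toy sensor resolution (assumed)
TAU = 0.05      # saliency decay time constant in seconds (assumed)

def update_saliency(saliency, last_t, events):
    """Leaky integration of asynchronous events.

    events: time-ordered iterable of (t, x, y, depth) tuples, where t is
    the event timestamp in seconds and depth a per-event depth estimate.
    """
    for t, x, y, depth in events:
        # exponentially decay the whole surface since the previous event
        saliency *= np.exp(-(t - last_t) / TAU)
        # nearer surfaces are weighted more strongly (1/depth)
        saliency[y, x] += 1.0 / max(depth, 1e-3)
        last_t = t
    return saliency, last_t

def winner_take_all(saliency):
    """Return the (x, y) location of the current attention focus."""
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    return int(x), int(y)

# toy usage: a burst of events on a near object at (10, 20),
# plus a single event from a far object at (40, 40)
saliency = np.zeros((H, W))
events = [(i * 1e-3, 10, 20, 0.5) for i in range(50)]  # near, 0.5 m
events += [(0.051, 40, 40, 2.0)]                       # far, 2.0 m
saliency, _ = update_saliency(saliency, 0.0, events)
print(winner_take_all(saliency))  # → (10, 20): the near, active object wins
```

The near object wins both because it generates more events (it is active) and because each of its events carries a larger inverse-depth weight, which mirrors the paper's premise that depth can bias bottom-up attention toward nearby proto-objects.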
id pubmed-9090933
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal Sci Rep
Nature Publishing Group UK 2022-05-10 /pmc/articles/PMC9090933/ /pubmed/35538154 http://dx.doi.org/10.1038/s41598-022-11723-6 Text en © The Author(s) 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as appropriate credit is given to the original author(s) and the source, a link to the licence is provided, and any changes are indicated. Third-party material in this article is included in the article's licence unless indicated otherwise in a credit line. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/.