A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots
In order to meet the increasing demands of mobile service robot applications, a dedicated perception module is an essential requirement for interaction with users in real-world scenarios. In particular, multi-sensor fusion and human re-identification are recognized as active research fronts. In this paper, we contribute to this topic and present a modular detection and tracking system that models the position and additional properties of persons in the surroundings of a mobile robot. The proposed system introduces a probability-based data association method that, besides position, can incorporate face and color-based appearance features in order to realize a re-identification of persons when tracking gets interrupted. The system combines the results of various state-of-the-art image-based detection systems for person recognition, person identification, and attribute estimation. This allows a stable estimate of a mobile robot's user, even in complex, cluttered environments with long-lasting occlusions. In our benchmark, we introduce a new measure for tracking consistency and show the improvements when face- and appearance-based re-identification are combined. The tracking system was applied in a real-world application with a mobile rehabilitation assistant robot in a public hospital. The estimated states of persons are used for user-centered navigation behaviors, e.g., guiding or approaching a person, but also for realizing socially acceptable navigation in public environments.
| Main Authors: | Müller, Steffen; Wengefeld, Tim; Trinh, Thanh Quang; Aganian, Dustin; Eisenbach, Markus; Gross, Horst-Michael |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | MDPI, 2020 |
| Subjects: | Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7038368/ https://www.ncbi.nlm.nih.gov/pubmed/32012943 http://dx.doi.org/10.3390/s20030722 |
_version_ | 1783500624239788032 |
---|---|
author | Müller, Steffen; Wengefeld, Tim; Trinh, Thanh Quang; Aganian, Dustin; Eisenbach, Markus; Gross, Horst-Michael |
author_facet | Müller, Steffen; Wengefeld, Tim; Trinh, Thanh Quang; Aganian, Dustin; Eisenbach, Markus; Gross, Horst-Michael |
author_sort | Müller, Steffen |
collection | PubMed |
description | In order to meet the increasing demands of mobile service robot applications, a dedicated perception module is an essential requirement for interaction with users in real-world scenarios. In particular, multi-sensor fusion and human re-identification are recognized as active research fronts. In this paper, we contribute to this topic and present a modular detection and tracking system that models the position and additional properties of persons in the surroundings of a mobile robot. The proposed system introduces a probability-based data association method that, besides position, can incorporate face and color-based appearance features in order to realize a re-identification of persons when tracking gets interrupted. The system combines the results of various state-of-the-art image-based detection systems for person recognition, person identification, and attribute estimation. This allows a stable estimate of a mobile robot's user, even in complex, cluttered environments with long-lasting occlusions. In our benchmark, we introduce a new measure for tracking consistency and show the improvements when face- and appearance-based re-identification are combined. The tracking system was applied in a real-world application with a mobile rehabilitation assistant robot in a public hospital. The estimated states of persons are used for user-centered navigation behaviors, e.g., guiding or approaching a person, but also for realizing socially acceptable navigation in public environments. |
format | Online Article Text |
id | pubmed-7038368 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-7038368 2020-03-09 A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots Müller, Steffen; Wengefeld, Tim; Trinh, Thanh Quang; Aganian, Dustin; Eisenbach, Markus; Gross, Horst-Michael Sensors (Basel) Article In order to meet the increasing demands of mobile service robot applications, a dedicated perception module is an essential requirement for interaction with users in real-world scenarios. In particular, multi-sensor fusion and human re-identification are recognized as active research fronts. In this paper, we contribute to this topic and present a modular detection and tracking system that models the position and additional properties of persons in the surroundings of a mobile robot. The proposed system introduces a probability-based data association method that, besides position, can incorporate face and color-based appearance features in order to realize a re-identification of persons when tracking gets interrupted. The system combines the results of various state-of-the-art image-based detection systems for person recognition, person identification, and attribute estimation. This allows a stable estimate of a mobile robot's user, even in complex, cluttered environments with long-lasting occlusions. In our benchmark, we introduce a new measure for tracking consistency and show the improvements when face- and appearance-based re-identification are combined. The tracking system was applied in a real-world application with a mobile rehabilitation assistant robot in a public hospital. The estimated states of persons are used for user-centered navigation behaviors, e.g., guiding or approaching a person, but also for realizing socially acceptable navigation in public environments. MDPI 2020-01-28 /pmc/articles/PMC7038368/ /pubmed/32012943 http://dx.doi.org/10.3390/s20030722 Text en © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article; Müller, Steffen; Wengefeld, Tim; Trinh, Thanh Quang; Aganian, Dustin; Eisenbach, Markus; Gross, Horst-Michael; A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title | A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title_full | A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title_fullStr | A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title_full_unstemmed | A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title_short | A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title_sort | multi-modal person perception framework for socially interactive mobile service robots |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7038368/ https://www.ncbi.nlm.nih.gov/pubmed/32012943 http://dx.doi.org/10.3390/s20030722 |
work_keys_str_mv | AT mullersteffen amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT wengefeldtim amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT trinhthanhquang amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT aganiandustin amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT eisenbachmarkus amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT grosshorstmichael amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT mullersteffen multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT wengefeldtim multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT trinhthanhquang multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT aganiandustin multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT eisenbachmarkus multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT grosshorstmichael multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots |
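The abstract describes a probability-based data association method that fuses position with face and color-based appearance features to re-identify persons after tracking interruptions. The paper's actual implementation is not reproduced in this record; the following is a minimal illustrative sketch of the general idea only. All function names, the fixed fusion weights, the Gaussian position model, and the greedy assignment strategy are assumptions made for the example, not details from the paper.

```python
import math

def gaussian_pos_likelihood(track_xy, det_xy, sigma=0.5):
    """Likelihood of a detection given a track's predicted 2D position
    (unnormalized Gaussian; sigma is an assumed motion uncertainty)."""
    dx = track_xy[0] - det_xy[0]
    dy = track_xy[1] - det_xy[1]
    return math.exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma))

def cosine_similarity(a, b):
    """Cosine similarity between two appearance feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def association_score(track, detection, w_pos=0.5, w_app=0.5):
    """Fuse a position cue and an appearance cue into one score in [0, 1]."""
    p_pos = gaussian_pos_likelihood(track["xy"], detection["xy"])
    # Map cosine similarity from [-1, 1] onto [0, 1] before weighting.
    p_app = (cosine_similarity(track["feat"], detection["feat"]) + 1.0) / 2.0
    return w_pos * p_pos + w_app * p_app

def associate(tracks, detections, threshold=0.6):
    """Greedily assign each detection to at most one track; pairs whose
    fused score stays below the threshold remain unassociated, so a
    re-appearing person can still be matched by appearance alone."""
    pairs, used = [], set()
    for t in tracks:
        best, best_s = None, threshold
        for i, d in enumerate(detections):
            if i in used:
                continue
            s = association_score(t, d)
            if s > best_s:
                best, best_s = i, s
        if best is not None:
            used.add(best)
            pairs.append((t["id"], best))
    return pairs
```

A production tracker would typically replace the greedy loop with an optimal assignment (e.g., the Hungarian algorithm) and learn the cue weights rather than fixing them, but the sketch shows how heterogeneous cues can be folded into a single association probability.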