A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera
Main Authors: | Pathi, Sai Krishna; Kiselev, Andrey; Kristoffersson, Annica; Repsilber, Dirk; Loutfi, Amy |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2019 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6679565/ https://www.ncbi.nlm.nih.gov/pubmed/31319523 http://dx.doi.org/10.3390/s19143142 |
_version_ | 1783441364735754240 |
---|---|
author | Pathi, Sai Krishna; Kiselev, Andrey; Kristoffersson, Annica; Repsilber, Dirk; Loutfi, Amy |
author_facet | Pathi, Sai Krishna; Kiselev, Andrey; Kristoffersson, Annica; Repsilber, Dirk; Loutfi, Amy |
author_sort | Pathi, Sai Krishna |
collection | PubMed |
description | Estimating distances between people and robots plays a crucial role in understanding social Human–Robot Interaction (HRI) from an egocentric view. It is a key step if robots are to engage in social interactions and to collaborate with people as part of human–robot teams. For distance estimation between a person and a robot, different sensors can be employed, and the number of challenges to be addressed by the distance estimation methods rises with the simplicity of the sensor technology. When estimating distances using individual images from a single camera in an egocentric position, it is often required that individuals in the scene are facing the camera, do not occlude each other, and are sufficiently visible so that specific facial or body features can be identified. In this paper, we propose a novel method for estimating distances between a robot and people using single images from a single egocentric camera. The method is based on previously proven 2D pose estimation, which handles partial occlusions, cluttered backgrounds, and relatively low resolution. The method estimates distance with respect to the camera based on the Euclidean distance between the ear and torso of each person in the image plane (a minimal code sketch illustrating this step follows the record below). The ear and torso characteristic points were selected based on their relatively high visibility regardless of a person's orientation and a certain degree of uniformity with regard to age and gender. Experimental validation demonstrates the effectiveness of the proposed method. |
format | Online Article Text |
id | pubmed-6679565 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-6679565 2019-08-19 A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera Pathi, Sai Krishna Kiselev, Andrey Kristoffersson, Annica Repsilber, Dirk Loutfi, Amy Sensors (Basel) Article Estimating distances between people and robots plays a crucial role in understanding social Human–Robot Interaction (HRI) from an egocentric view. It is a key step if robots are to engage in social interactions and to collaborate with people as part of human–robot teams. For distance estimation between a person and a robot, different sensors can be employed, and the number of challenges to be addressed by the distance estimation methods rises with the simplicity of the sensor technology. When estimating distances using individual images from a single camera in an egocentric position, it is often required that individuals in the scene are facing the camera, do not occlude each other, and are sufficiently visible so that specific facial or body features can be identified. In this paper, we propose a novel method for estimating distances between a robot and people using single images from a single egocentric camera. The method is based on previously proven 2D pose estimation, which handles partial occlusions, cluttered backgrounds, and relatively low resolution. The method estimates distance with respect to the camera based on the Euclidean distance between the ear and torso of each person in the image plane. The ear and torso characteristic points were selected based on their relatively high visibility regardless of a person's orientation and a certain degree of uniformity with regard to age and gender. Experimental validation demonstrates the effectiveness of the proposed method. MDPI 2019-07-17 /pmc/articles/PMC6679565/ /pubmed/31319523 http://dx.doi.org/10.3390/s19143142 Text en © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Pathi, Sai Krishna Kiselev, Andrey Kristoffersson, Annica Repsilber, Dirk Loutfi, Amy A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera |
title | A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera |
title_full | A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera |
title_fullStr | A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera |
title_full_unstemmed | A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera |
title_short | A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera |
title_sort | novel method for estimating distances from a robot to humans using egocentric rgb camera |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6679565/ https://www.ncbi.nlm.nih.gov/pubmed/31319523 http://dx.doi.org/10.3390/s19143142 |
work_keys_str_mv | AT pathisaikrishna anovelmethodforestimatingdistancesfromarobottohumansusingegocentricrgbcamera AT kiselevandrey anovelmethodforestimatingdistancesfromarobottohumansusingegocentricrgbcamera AT kristofferssonannica anovelmethodforestimatingdistancesfromarobottohumansusingegocentricrgbcamera AT repsilberdirk anovelmethodforestimatingdistancesfromarobottohumansusingegocentricrgbcamera AT loutfiamy anovelmethodforestimatingdistancesfromarobottohumansusingegocentricrgbcamera AT pathisaikrishna novelmethodforestimatingdistancesfromarobottohumansusingegocentricrgbcamera AT kiselevandrey novelmethodforestimatingdistancesfromarobottohumansusingegocentricrgbcamera AT kristofferssonannica novelmethodforestimatingdistancesfromarobottohumansusingegocentricrgbcamera AT repsilberdirk novelmethodforestimatingdistancesfromarobottohumansusingegocentricrgbcamera AT loutfiamy novelmethodforestimatingdistancesfromarobottohumansusingegocentricrgbcamera |
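The description field above outlines the core computation: detect each person's 2D pose keypoints, measure the Euclidean distance between the ear and torso points in the image plane, and map that pixel distance to a camera-to-person distance. The sketch below illustrates this pipeline under stated assumptions; the keypoint format (OpenPose-style (x, y, confidence) tuples), the confidence threshold, the choice of torso reference point, and especially the inverse-proportional mapping with its constant `k` are illustrative placeholders, not values or formulas taken from the paper, which this record does not specify.

```python
import math
from typing import Optional, Tuple

# (x, y, confidence) as typically produced by a 2D pose estimator (assumed format).
Keypoint = Tuple[float, float, float]


def ear_torso_pixel_distance(ear: Keypoint, torso: Keypoint,
                             min_conf: float = 0.3) -> Optional[float]:
    """Euclidean distance in the image plane between the ear and torso keypoints.

    Returns None when either keypoint is missing or below the confidence
    threshold (e.g., occluded), so the caller can fall back to the other ear
    or skip the person.
    """
    if ear[2] < min_conf or torso[2] < min_conf:
        return None
    return math.hypot(ear[0] - torso[0], ear[1] - torso[1])


def estimate_distance_m(pixel_dist: float, k: float = 550.0) -> float:
    """Map the ear-torso pixel distance to a metric camera-to-person distance.

    Assumes the projected ear-torso span shrinks roughly in inverse proportion
    to distance, so distance ~ k / pixel_dist. The constant k is a placeholder
    that would have to be fitted on calibration images taken at known
    distances; it is NOT a value reported by the paper.
    """
    return k / pixel_dist


# Hypothetical keypoints for one detected person in an egocentric frame.
ear = (412.0, 188.0, 0.91)    # right ear keypoint
torso = (405.0, 310.0, 0.88)  # torso reference keypoint (e.g., neck/mid-shoulder)

px = ear_torso_pixel_distance(ear, torso)
if px is not None:
    print(f"ear-torso span: {px:.1f} px -> ~{estimate_distance_m(px):.2f} m")
```

The inverse-proportional model is only one plausible way to turn the ear-torso pixel span into meters; in practice the mapping would be learned or calibrated per camera, as the published method's exact calibration is not described in this record.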