Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation

Powered wheelchairs have enhanced the mobility and quality of life of people with special needs. The next step in the development of powered wheelchairs is to incorporate sensors and electronic systems for new control applications and capabilities to improve their usability and the safety of their operation, such as obstacle avoidance or autonomous driving. However, autonomous powered wheelchairs require safe navigation in different environments and scenarios, making their development complex. In our research, we propose, instead, to develop contactless control for powered wheelchairs where the position of the caregiver is used as a control reference. Hence, we used a depth camera to recognize the caregiver and measure at the same time their relative distance from the powered wheelchair. In this paper, we compared two different approaches for real-time object recognition using a 3DHOG hand-crafted object descriptor based on a 3D extension of the histogram of oriented gradients (HOG) and a convolutional neural network based on YOLOv4-Tiny. To evaluate both approaches, we constructed Miun-Feet—a custom dataset of images of labeled caregiver’s feet in different scenarios, with backgrounds, objects, and lighting conditions. The experimental results showed that the YOLOv4-Tiny approach outperformed 3DHOG in all the analyzed cases. In addition, the results showed that the recognition accuracy was not improved using the depth channel, enabling the use of a monocular RGB camera only instead of a depth camera and reducing the computational cost and heat dissipation limitations. Hence, the paper proposes an additional method to compute the caregiver’s distance and angle from the Powered Wheelchair (PW) using only the RGB data. This work shows that it is feasible to use the location of the caregiver’s feet as a control signal for the control of a powered wheelchair and that it is possible to use a monocular RGB camera to compute their relative positions.
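
The record itself contains no implementation details, but as an illustration of the kind of detection pipeline the abstract describes, the following is a minimal sketch of running a YOLOv4-Tiny detector on monocular RGB frames with OpenCV's DNN module. The config and weight file names and the single "feet" class are assumptions for illustration only; the authors' Miun-Feet-trained model is not part of this record.

```python
# Hypothetical sketch: YOLOv4-Tiny inference on RGB frames via OpenCV's DNN module.
# File names and the single "feet" class are assumptions, not the authors' artifacts.
import cv2

CFG = "yolov4-tiny-feet.cfg"          # assumed Darknet config
WEIGHTS = "yolov4-tiny-feet.weights"  # assumed trained weights

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)  # BGR -> RGB, normalize

cap = cv2.VideoCapture(0)  # monocular RGB camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    class_ids, scores, boxes = model.detect(frame, confThreshold=0.5, nmsThreshold=0.4)
    for score, (x, y, w, h) in zip(scores, boxes):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, f"feet {float(score):.2f}", (x, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    cv2.imshow("feet detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

Each detection box could then feed a position estimator such as the one sketched after the description field below.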

Bibliographic Details
Main Authors: Giménez, Cristian Vilar, Krug, Silvia, Qureshi, Faisal Z., O’Nils, Mattias
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8706737/
https://www.ncbi.nlm.nih.gov/pubmed/34940723
http://dx.doi.org/10.3390/jimaging7120255
author Giménez, Cristian Vilar
Krug, Silvia
Qureshi, Faisal Z.
O’Nils, Mattias
collection PubMed
description Powered wheelchairs have enhanced the mobility and quality of life of people with special needs. The next step in the development of powered wheelchairs is to incorporate sensors and electronic systems for new control applications and capabilities to improve their usability and the safety of their operation, such as obstacle avoidance or autonomous driving. However, autonomous powered wheelchairs require safe navigation in different environments and scenarios, making their development complex. In our research, we propose, instead, to develop contactless control for powered wheelchairs where the position of the caregiver is used as a control reference. Hence, we used a depth camera to recognize the caregiver and measure at the same time their relative distance from the powered wheelchair. In this paper, we compared two different approaches for real-time object recognition using a 3DHOG hand-crafted object descriptor based on a 3D extension of the histogram of oriented gradients (HOG) and a convolutional neural network based on YOLOv4-Tiny. To evaluate both approaches, we constructed Miun-Feet—a custom dataset of images of labeled caregiver’s feet in different scenarios, with backgrounds, objects, and lighting conditions. The experimental results showed that the YOLOv4-Tiny approach outperformed 3DHOG in all the analyzed cases. In addition, the results showed that the recognition accuracy was not improved using the depth channel, enabling the use of a monocular RGB camera only instead of a depth camera and reducing the computational cost and heat dissipation limitations. Hence, the paper proposes an additional method to compute the caregiver’s distance and angle from the Powered Wheelchair (PW) using only the RGB data. This work shows that it is feasible to use the location of the caregiver’s feet as a control signal for the control of a powered wheelchair and that it is possible to use a monocular RGB camera to compute their relative positions.
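
The description above states that the caregiver's distance and angle can be computed from RGB data alone, but the record does not say how. One plausible, simplified way is a pinhole camera model with an assumed real-world width for the detected feet box, sketched below. The intrinsics (FX, CX) and FOOT_WIDTH_M are illustrative placeholders and are not taken from the paper.

```python
# Minimal sketch, not the paper's method: estimate distance and bearing of a detected
# feet bounding box with a pinhole camera model. Intrinsics and the assumed physical
# foot width are illustrative placeholders.
import math

FX = 615.0           # assumed focal length in pixels (x axis)
CX = 320.0           # assumed principal point (image centre, x)
FOOT_WIDTH_M = 0.25  # assumed real-world width spanned by the feet box, in metres

def distance_and_angle(box):
    """box = (x, y, w, h) in pixels; returns (distance_m, angle_rad)."""
    x, y, w, h = box
    # Similar triangles: apparent pixel width shrinks linearly with distance.
    distance = FX * FOOT_WIDTH_M / w
    # Horizontal bearing of the box centre relative to the optical axis.
    u_centre = x + w / 2.0
    angle = math.atan2(u_centre - CX, FX)
    return distance, angle

# Example: a 100 px wide box whose centre sits 80 px right of the principal point
d, a = distance_and_angle((350, 300, 100, 120))
print(f"distance ≈ {d:.2f} m, angle ≈ {math.degrees(a):.1f}°")
```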
format Online
Article
Text
id pubmed-8706737
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8706737 2021-12-25 Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation. Giménez, Cristian Vilar; Krug, Silvia; Qureshi, Faisal Z.; O’Nils, Mattias. J Imaging, Article. MDPI 2021-11-30 /pmc/articles/PMC8706737/ /pubmed/34940723 http://dx.doi.org/10.3390/jimaging7120255 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Evaluation of 2D-/3D-Feet-Detection Methods for Semi-Autonomous Powered Wheelchair Navigation
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8706737/
https://www.ncbi.nlm.nih.gov/pubmed/34940723
http://dx.doi.org/10.3390/jimaging7120255