
Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface


Bibliographic Details
Main Authors: Wöhle, Lukas; Gebhard, Marion
Format: Online Article Text
Language: English
Published: MDPI, 2021
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7962065/
https://www.ncbi.nlm.nih.gov/pubmed/33807599
http://dx.doi.org/10.3390/s21051798
Description: This paper presents a lightweight, infrastructureless head-worn interface for robust and real-time robot control in Cartesian space using head- and eye-gaze. The interface comes at a total weight of just 162 g. It combines a state-of-the-art visual simultaneous localization and mapping algorithm (ORB-SLAM 2) for RGB-D cameras with a Magnetic, Angular Rate, and Gravity (MARG) sensor filter. The data fusion process is designed to dynamically switch between magnetic, inertial, and visual heading sources to enable robust orientation estimation under various disturbances, e.g., magnetic disturbances or degraded visual sensor data. The interface furthermore delivers accurate eye- and head-gaze vectors to enable precise robot end effector (EFF) positioning and employs a head motion mapping technique to effectively control the robot's end effector orientation. An experimental proof of concept demonstrates that the proposed interface and its data fusion process generate reliable and robust pose estimation. The three-dimensional head- and eye-gaze position estimation pipeline delivers a mean Euclidean error of [Formula: see text] mm for head-gaze and [Formula: see text] mm for eye-gaze at a distance of 0.3–1.1 m from the user. This indicates that the proposed interface offers a precise control mechanism for hands-free and full six degree of freedom (DoF) robot teleoperation in Cartesian space by head- or eye-gaze and head motion.
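The dynamic switching between magnetic, inertial, and visual heading sources described in the abstract can be pictured as a simple fallback cascade: trust the magnetometer heading while it is undisturbed, fall back to the visual (ORB-SLAM 2) heading when it is not, and use the integrated gyro heading only as a last resort. The sketch below is a minimal illustration of that idea; the function name, parameters, and disturbance flags are assumptions for illustration, not the authors' actual filter implementation.

```python
import math

def select_heading(mag_heading, vis_heading, gyro_heading,
                   mag_disturbed, vis_degraded):
    """Return (heading_rad, source) by falling back across heading sources."""
    if not mag_disturbed:
        return mag_heading, "magnetic"   # magnetometer currently trusted
    if not vis_degraded:
        return vis_heading, "visual"     # fall back to the visual SLAM heading
    return gyro_heading, "inertial"      # last resort: integrated gyro heading

# Example: magnetometer disturbed (e.g., nearby ferromagnetic object),
# visual tracking healthy -> the visual heading is selected.
heading, source = select_heading(
    mag_heading=math.radians(92.0),
    vis_heading=math.radians(90.1),
    gyro_heading=math.radians(90.4),
    mag_disturbed=True,
    vis_degraded=False,
)
print(source)  # -> visual
```

A real MARG/visual fusion filter would of course blend these sources continuously (e.g., with adaptive weights or covariances) rather than switch hard, but the cascade captures the robustness argument: no single disturbed source can corrupt the orientation estimate on its own.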
Journal: Sensors (Basel)
Published online: 2021-03-05
License: © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).