
Gaze Point Tracking Based on a Robotic Body–Head–Eye Coordination Method


Bibliographic Details
Main Authors: Feng, Xingyang; Wang, Qingbin; Cong, Hua; Zhang, Yu; Qiu, Mianhao
Format: Online Article (Text)
Language: English
Published: MDPI, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10383314/
https://www.ncbi.nlm.nih.gov/pubmed/37514595
http://dx.doi.org/10.3390/s23146299
Collection: PubMed
Description: When the magnitude of a gaze is too large, human beings change the orientation of their head or body to assist their eyes in tracking targets because saccade alone is insufficient to keep a target at the center region of the retina. To make a robot gaze at targets rapidly and stably (as a human does), it is necessary to design a body–head–eye coordinated motion control strategy. A robot system equipped with eyes and a head is designed in this paper. Gaze point tracking problems are divided into two sub-problems: in situ gaze point tracking and approaching gaze point tracking. In the in situ gaze tracking state, the desired positions of the eye, head and body are calculated on the basis of minimizing resource consumption and maximizing stability. In the approaching gaze point tracking state, the robot is expected to approach the object at a zero angle. In the process of tracking, the three-dimensional (3D) coordinates of the object are obtained by the bionic eye and then converted to the head coordinate system and the mobile robot coordinate system. The desired positions of the head, eyes and body are obtained according to the object’s 3D coordinates. Then, using sophisticated motor control methods, the head, eyes and body are controlled to the desired position. This method avoids the complex process of adjusting control parameters and does not require the design of complex control algorithms. Based on this strategy, in situ gaze point tracking and approaching gaze point tracking experiments are performed by the robot. The experimental results show that body–head–eye coordination gaze point tracking based on the 3D coordinates of an object is feasible. This paper provides a new method that differs from the traditional two-dimensional image-based method for robotic body–head–eye gaze point tracking.
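The coordinate-conversion step described in the abstract (the object's 3D position, measured by the bionic eye, expressed in the head frame and then the mobile-robot frame) can be sketched with chained homogeneous transforms. This is an illustrative sketch only, not the authors' implementation; the frame offsets, head yaw angle, and variable names below are assumptions chosen for the example.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    """Rotation about the z-axis (yaw) by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Hypothetical frame layout: the eye frame sits 0.1 m in front of the head
# frame, and the head is yawed 30 degrees and mounted 0.5 m above the robot base.
T_head_eye = make_transform(np.eye(3), np.array([0.0, 0.0, 0.1]))
T_base_head = make_transform(rot_z(np.radians(30)), np.array([0.0, 0.0, 0.5]))

# A target point measured in the eye (stereo camera) frame, in homogeneous coordinates.
p_eye = np.array([0.2, 0.0, 1.0, 1.0])

p_head = T_head_eye @ p_eye    # eye frame -> head frame
p_base = T_base_head @ p_head  # head frame -> robot base frame

# One possible "desired position" derived from the target: the yaw the body
# would need in order to face the target head-on (the zero-angle approach).
pan = np.arctan2(p_base[1], p_base[0])
```

Given the target's coordinates in the base frame, desired eye, head, and body angles can each be read off with similar trigonometry, which is what lets the method skip tuning a visual-servoing control loop.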
Record ID: pubmed-10383314
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Sensors (Basel)
Published Online: 2023-07-11
License: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).