A Novel Zernike Moment-Based Real-Time Head Pose and Gaze Estimation Framework for Accuracy-Sensitive Applications
A real-time head pose and gaze estimation (HPGE) algorithm has excellent potential for technological advances in human–machine and human–robot interaction. For example, in accuracy-sensitive applications such as driver assistance systems (DAS), HPGE plays a crucial role in preventing accidents and road hazards…
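The abstract describes the feature-extraction step only at a high level. As a rough, illustrative sketch of that step (not the authors' implementation), the following NumPy-only example computes the magnitudes of Zernike moments for a grayscale face or eye patch; the function names, the moment degree, and the normalization are assumptions made for illustration.

```python
# Illustrative sketch only -- not the authors' code. Computes Zernike moment
# magnitudes |A_{n,m}| of a grayscale patch; these magnitudes are invariant to
# in-plane rotation, which is the property the abstract relies on.
import numpy as np
from math import factorial


def radial_poly(rho, n, m):
    """Zernike radial polynomial R_{n,|m|}(rho)."""
    m = abs(m)
    R = np.zeros_like(rho)
    for s in range((n - m) // 2 + 1):
        c = ((-1) ** s * factorial(n - s) /
             (factorial(s) * factorial((n + m) // 2 - s) * factorial((n - m) // 2 - s)))
        R += c * rho ** (n - 2 * s)
    return R


def zernike_magnitudes(patch, degree=8):
    """Return |A_{n,m}| for n <= degree, 0 <= m <= n, n - m even.

    `degree` is an illustrative choice; constant scale factors are ignored
    because they do not affect a discriminant classifier.
    """
    h, w = patch.shape
    y, x = np.mgrid[:h, :w]
    # Map the pixel grid onto the unit disk centred in the patch.
    xn = (2.0 * x - (w - 1)) / (w - 1)
    yn = (2.0 * y - (h - 1)) / (h - 1)
    rho = np.hypot(xn, yn)
    theta = np.arctan2(yn, xn)
    inside = rho <= 1.0
    f = patch.astype(float) * inside

    feats = []
    for n in range(degree + 1):
        for m in range(n + 1):
            if (n - m) % 2:
                continue
            # Conjugate basis function: V*_{n,m} = R_{n,m}(rho) * exp(-j*m*theta)
            Vconj = radial_poly(rho, n, m) * np.exp(-1j * m * theta) * inside
            A = (n + 1) / np.pi * np.sum(f * Vconj)
            feats.append(abs(A))
    return np.array(feats)


if __name__ == "__main__":
    patch = np.random.rand(64, 64)          # stand-in for a detected face/eye crop
    print(zernike_magnitudes(patch).shape)  # (25,) feature vector for degree 8
```

The resulting feature vectors could then be passed to a standard linear discriminant classifier to label head pose and gaze direction, in line with the discriminant analysis step the abstract mentions.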
Main authors: | Vankayalapati, Hima Deepthi; Kuchibhotla, Swarna; Chadalavada, Mohan Sai Kumar; Dargar, Shashi Kant; Anne, Koteswara Rao; Kyandoghere, Kyamakya |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2022 |
Subjects: | Article |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9658879/ https://www.ncbi.nlm.nih.gov/pubmed/36366147 http://dx.doi.org/10.3390/s22218449 |
_version_ | 1784830063063072768 |
---|---|
author | Vankayalapati, Hima Deepthi; Kuchibhotla, Swarna; Chadalavada, Mohan Sai Kumar; Dargar, Shashi Kant; Anne, Koteswara Rao; Kyandoghere, Kyamakya |
author_facet | Vankayalapati, Hima Deepthi; Kuchibhotla, Swarna; Chadalavada, Mohan Sai Kumar; Dargar, Shashi Kant; Anne, Koteswara Rao; Kyandoghere, Kyamakya |
author_sort | Vankayalapati, Hima Deepthi |
collection | PubMed |
description | A real-time head pose and gaze estimation (HPGE) algorithm has excellent potential for technological advances in human–machine and human–robot interaction. For example, in accuracy-sensitive applications such as driver assistance systems (DAS), HPGE plays a crucial role in preventing accidents and road hazards. In this paper, the authors propose a new hybrid framework that improves estimation by combining conventional appearance-based and geometry-based methods to extract local and global features. The Zernike moments algorithm is prominent for extracting rotation-, scale-, and illumination-invariant features; conventional discriminant algorithms are then used to classify head poses and gaze direction. Experiments were performed on standard datasets and real-time images to analyze the accuracy of the proposed algorithm. As a result, the proposed framework immediately estimates the range of direction changes under different illumination conditions. We obtained an accuracy of ~85%; the average response times were 21.52 ms and 7.483 ms for estimating head pose and gaze, respectively, independent of illumination, background, and occlusion. The proposed method is promising for the future development of a robust system that remains invariant even under blurring conditions, thus reaching a much more significant performance enhancement. |
format | Online Article Text |
id | pubmed-9658879 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9658879 2022-11-15 A Novel Zernike Moment-Based Real-Time Head Pose and Gaze Estimation Framework for Accuracy-Sensitive Applications Vankayalapati, Hima Deepthi; Kuchibhotla, Swarna; Chadalavada, Mohan Sai Kumar; Dargar, Shashi Kant; Anne, Koteswara Rao; Kyandoghere, Kyamakya Sensors (Basel) Article A real-time head pose and gaze estimation (HPGE) algorithm has excellent potential for technological advances in human–machine and human–robot interaction. For example, in accuracy-sensitive applications such as driver assistance systems (DAS), HPGE plays a crucial role in preventing accidents and road hazards. In this paper, the authors propose a new hybrid framework that improves estimation by combining conventional appearance-based and geometry-based methods to extract local and global features. The Zernike moments algorithm is prominent for extracting rotation-, scale-, and illumination-invariant features; conventional discriminant algorithms are then used to classify head poses and gaze direction. Experiments were performed on standard datasets and real-time images to analyze the accuracy of the proposed algorithm. As a result, the proposed framework immediately estimates the range of direction changes under different illumination conditions. We obtained an accuracy of ~85%; the average response times were 21.52 ms and 7.483 ms for estimating head pose and gaze, respectively, independent of illumination, background, and occlusion. The proposed method is promising for the future development of a robust system that remains invariant even under blurring conditions, thus reaching a much more significant performance enhancement. MDPI 2022-11-03 /pmc/articles/PMC9658879/ /pubmed/36366147 http://dx.doi.org/10.3390/s22218449 Text en © 2022 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article; Vankayalapati, Hima Deepthi; Kuchibhotla, Swarna; Chadalavada, Mohan Sai Kumar; Dargar, Shashi Kant; Anne, Koteswara Rao; Kyandoghere, Kyamakya; A Novel Zernike Moment-Based Real-Time Head Pose and Gaze Estimation Framework for Accuracy-Sensitive Applications |
title | A Novel Zernike Moment-Based Real-Time Head Pose and Gaze Estimation Framework for Accuracy-Sensitive Applications |
title_full | A Novel Zernike Moment-Based Real-Time Head Pose and Gaze Estimation Framework for Accuracy-Sensitive Applications |
title_fullStr | A Novel Zernike Moment-Based Real-Time Head Pose and Gaze Estimation Framework for Accuracy-Sensitive Applications |
title_full_unstemmed | A Novel Zernike Moment-Based Real-Time Head Pose and Gaze Estimation Framework for Accuracy-Sensitive Applications |
title_short | A Novel Zernike Moment-Based Real-Time Head Pose and Gaze Estimation Framework for Accuracy-Sensitive Applications |
title_sort | novel zernike moment-based real-time head pose and gaze estimation framework for accuracy-sensitive applications |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9658879/ https://www.ncbi.nlm.nih.gov/pubmed/36366147 http://dx.doi.org/10.3390/s22218449 |
work_keys_str_mv | AT vankayalapatihimadeepthi anovelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT kuchibhotlaswarna anovelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT chadalavadamohansaikumar anovelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT dargarshashikant anovelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT annekoteswararao anovelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT kyandogherekyamakya anovelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT vankayalapatihimadeepthi novelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT kuchibhotlaswarna novelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT chadalavadamohansaikumar novelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT dargarshashikant novelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT annekoteswararao novelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications AT kyandogherekyamakya novelzernikemomentbasedrealtimeheadposeandgazeestimationframeworkforaccuracysensitiveapplications |