New combined DT-CWT and HOG descriptor for static and dynamic hand gesture recognition

In recent years, researchers have been focusing on developing Human-Computer Interfaces that are fast, intuitive, and allow direct interaction with the computing environment. Hand gestures are one of the most natural forms of communication. In this context, many systems have been developed to recognize hand gestures using vision-based techniques; however, these systems are highly affected by acquisition constraints such as resolution, noise, lighting conditions, hand shape, and pose. To enhance performance under such constraints, we propose a static and dynamic hand gesture recognition system that uses the Dual-Tree Complex Wavelet Transform (DT-CWT) to produce an approximation image with less noise and redundancy. The Histogram of Oriented Gradients (HOG) is then applied to the resulting image to extract relevant information and produce a compact feature vector. For classification, we compare the performance of three Artificial Neural Networks, namely MLP, PNN, and RBNN; Random Decision Forest and SVM classifiers are also used to improve the efficiency of the system. Experimental evaluation is performed on four datasets composed of alphabet signs and dynamic gestures. The results demonstrate the efficiency of the combined features, with recognition rates comparable to the state of the art.
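For readers who want to prototype the descriptor, the sketch below chains a DT-CWT approximation image into HOG, roughly following the pipeline summarized in the abstract. It is a minimal illustration, assuming the Python dtcwt package and scikit-image's hog function; the image size, decomposition depth, and HOG cell/block settings are placeholders, not the parameters used in the paper.

    # Hedged sketch: combined DT-CWT + HOG descriptor, as summarized in the abstract.
    # The dtcwt package and scikit-image are assumed tooling choices; the
    # decomposition depth, image size, and HOG settings are illustrative only.
    import numpy as np
    import dtcwt
    from skimage.feature import hog
    from skimage.transform import resize

    def dtcwt_hog_features(gray_image, nlevels=2):
        """HOG vector computed on the DT-CWT approximation (lowpass) image."""
        # Forward DT-CWT; the pyramid's lowpass band is the approximation image,
        # which keeps the coarse hand shape while discarding noise and fine detail.
        pyramid = dtcwt.Transform2d().forward(np.asarray(gray_image, dtype=float),
                                              nlevels=nlevels)
        approximation = pyramid.lowpass

        # HOG over the approximation image gives a compact descriptor that can be
        # fed to any of the classifiers compared in the paper (MLP, PNN, RBNN,
        # Random Decision Forest, SVM).
        return hog(approximation,
                   orientations=9,
                   pixels_per_cell=(8, 8),
                   cells_per_block=(2, 2),
                   block_norm='L2-Hys',
                   feature_vector=True)

    if __name__ == "__main__":
        # Toy frame standing in for a segmented grayscale hand-gesture image.
        frame = resize(np.random.rand(240, 320), (128, 128), anti_aliasing=True)
        print(dtcwt_hog_features(frame).shape)  # 1-D combined feature vector

In a complete system, feature vectors extracted this way for each training image would be passed to an off-the-shelf classifier (for example scikit-learn's SVC or RandomForestClassifier) to reproduce the kind of comparison the abstract describes.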

Bibliographic Details
Main Authors: Agab, Salah Eddine; Chelali, Fatma Zohra
Format: Online Article Text
Language: English
Published: Springer US, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9887237/
https://www.ncbi.nlm.nih.gov/pubmed/36743996
http://dx.doi.org/10.1007/s11042-023-14433-x
Journal: Multimed Tools Appl
Published Online: 2023-01-31
Collection: PubMed (National Center for Biotechnology Information)
Record ID: pubmed-9887237
Record Format: MEDLINE/PubMed

Rights: © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law. This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.