
American Sign Language Recognition Using Leap Motion Controller with Machine Learning Approach

Sign language is intentionally designed to allow deaf and dumb communities to convey messages and to connect with society. Unfortunately, learning and practicing sign language is not common among society; hence, this study developed a sign language recognition prototype using the Leap Motion Controller (LMC). Many existing studies have proposed methods for incomplete sign language recognition, whereas this study aimed for full American Sign Language (ASL) recognition, which consists of 26 letters and 10 digits. Most of the ASL letters are static (no movement), but certain ASL letters are dynamic (they require certain movements). Thus, this study also aimed to extract features from finger and hand motions to differentiate between the static and dynamic gestures. The experimental results revealed that the sign language recognition rates for the 26 letters using a support vector machine (SVM) and a deep neural network (DNN) are 80.30% and 93.81%, respectively. Meanwhile, the recognition rates for a combination of 26 letters and 10 digits are slightly lower, approximately 72.79% for the SVM and 88.79% for the DNN. As a result, the sign language recognition system has great potential for reducing the gap between deaf and dumb communities and others. The proposed prototype could also serve as an interpreter for the deaf and dumb in everyday life in service sectors, such as at the bank or post office.
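The abstract describes classifying Leap-Motion-derived finger and hand features with an SVM and a DNN, but the paper's actual feature set and network architecture are not given in this record. The following is only a minimal illustrative sketch of that kind of pipeline, assuming hypothetical fixed-length feature vectors (synthetic stand-ins for fingertip and palm measurements) and using scikit-learn's SVC and MLPClassifier as generic substitutes for the authors' SVM and DNN.

```python
# Illustrative sketch only: SVM vs. small neural network on hand-gesture
# feature vectors. The features below are random stand-ins for Leap Motion
# measurements (fingertip coordinates, palm orientation, etc.); the paper's
# feature extraction and DNN architecture are not reproduced here.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

N_CLASSES = 36            # 26 ASL letters + 10 digits, as in the abstract
N_SAMPLES_PER_CLASS = 50  # hypothetical sample count per gesture class
N_FEATURES = 63           # hypothetical: e.g., 21 hand keypoints x (x, y, z)

# Synthetic data: each gesture class is a Gaussian blob in feature space.
X = np.vstack([
    rng.normal(loc=c, scale=2.0, size=(N_SAMPLES_PER_CLASS, N_FEATURES))
    for c in range(N_CLASSES)
])
y = np.repeat(np.arange(N_CLASSES), N_SAMPLES_PER_CLASS)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# Standardize features before feeding them to either classifier.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# SVM classifier (RBF kernel), one of the two model families compared.
svm = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)
print("SVM accuracy:", accuracy_score(y_test, svm.predict(X_test)))

# Small fully connected network as a generic stand-in for the paper's DNN.
dnn = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500,
                    random_state=0).fit(X_train, y_train)
print("DNN accuracy:", accuracy_score(y_test, dnn.predict(X_test)))
```

On real Leap Motion recordings, the feature extraction step (and, for dynamic letters such as J and Z, handling of motion over time) would replace the synthetic blobs above; the classifier comparison itself follows the same train/evaluate pattern.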


Bibliographic Details
Main Authors: Chong, Teak-Wei; Lee, Boon-Giin
Format: Online Article Text
Language: English
Published: MDPI, 2018
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6210690/
https://www.ncbi.nlm.nih.gov/pubmed/30347776
http://dx.doi.org/10.3390/s18103554
Journal: Sensors (Basel)
Date Published: 2018-10-19
Collection: PubMed (National Center for Biotechnology Information)
Record ID: pubmed-6210690
Record Format: MEDLINE/PubMed
Rights: © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).