
American Sign Language Translation Using Wearable Inertial and Electromyography Sensors for Tracking Hand Movements and Facial Expressions

A sign language translation system can break down the communication barrier between hearing-impaired people and others. In this paper, a novel American Sign Language (ASL) translation method based on wearable sensors is proposed. We leveraged inertial sensors to capture signs and surface electromyography (EMG) sensors to detect facial expressions. A convolutional neural network (CNN) was applied to extract features from the input signals, and long short-term memory (LSTM) and transformer models were then used to perform end-to-end translation from input signals to text sentences. Both models were evaluated on 40 ASL sentences that strictly follow grammatical rules, with word error rate (WER) and sentence error rate (SER) as the evaluation metrics. The LSTM model translated sentences in the testing dataset with a 7.74% WER and a 9.17% SER, while the transformer model performed considerably better, achieving a 4.22% WER and a 4.72% SER. These encouraging results indicate that both models are suitable for high-accuracy sign language translation. With more complete motion-capture sensors and facial expression recognition methods, the system has the potential to recognize more sentences.
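The record does not include the paper's evaluation code. As an illustration of the metrics named in the abstract, WER is conventionally the word-level edit distance between a reference sentence and the hypothesis, divided by the number of reference words, and SER is the fraction of sentences with at least one error. A minimal sketch, with all names our own:

```python
# Illustrative implementation of the WER/SER metrics named in the abstract.
# Function names and sample sentences are our own, not the authors' code.

def edit_distance(ref, hyp):
    """Word-level Levenshtein distance between two token lists."""
    d = list(range(len(hyp) + 1))  # one DP row, rolled over the reference
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = min(d[j] + 1,                           # deletion
                      d[j - 1] + 1,                       # insertion
                      prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev, d[j] = d[j], cur
    return d[-1]

def wer(refs, hyps):
    """Total word errors divided by total reference words."""
    errors = sum(edit_distance(r, h) for r, h in zip(refs, hyps))
    return errors / sum(len(r) for r in refs)

def ser(refs, hyps):
    """Fraction of sentences translated with at least one error."""
    return sum(r != h for r, h in zip(refs, hyps)) / len(refs)

refs = [["nice", "to", "meet", "you"], ["i", "love", "you"]]
hyps = [["nice", "to", "meet", "you"], ["i", "you"]]
print(wer(refs, hyps))  # 1 word error / 7 reference words ≈ 0.143
print(ser(refs, hyps))  # 1 of 2 sentences wrong = 0.5
```

Corpus-level WER is computed here by pooling errors and reference words across sentences rather than averaging per-sentence rates, which is the usual convention; the paper may aggregate differently.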


Bibliographic Details
Main Authors: Gu, Yutong; Zheng, Chao; Todoh, Masahiro; Zha, Fusheng
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2022
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9345758/
https://www.ncbi.nlm.nih.gov/pubmed/35937881
http://dx.doi.org/10.3389/fnins.2022.962141
author Gu, Yutong
Zheng, Chao
Todoh, Masahiro
Zha, Fusheng
collection PubMed
description A sign language translation system can break down the communication barrier between hearing-impaired people and others. In this paper, a novel American Sign Language (ASL) translation method based on wearable sensors is proposed. We leveraged inertial sensors to capture signs and surface electromyography (EMG) sensors to detect facial expressions. A convolutional neural network (CNN) was applied to extract features from the input signals, and long short-term memory (LSTM) and transformer models were then used to perform end-to-end translation from input signals to text sentences. Both models were evaluated on 40 ASL sentences that strictly follow grammatical rules, with word error rate (WER) and sentence error rate (SER) as the evaluation metrics. The LSTM model translated sentences in the testing dataset with a 7.74% WER and a 9.17% SER, while the transformer model performed considerably better, achieving a 4.22% WER and a 4.72% SER. These encouraging results indicate that both models are suitable for high-accuracy sign language translation. With more complete motion-capture sensors and facial expression recognition methods, the system has the potential to recognize more sentences.
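The abstract states that a CNN extracts features from the raw inertial and EMG signals. The paper's actual architecture is not reproduced in this record; the following pure-Python sketch only illustrates the core operation such a feature extractor performs over one window of a single sensor channel, namely a "valid" 1-D convolution followed by ReLU and max pooling. The toy window and kernel values are invented:

```python
# Illustrative sketch, not the authors' network: the basic building block of a
# 1-D CNN feature extractor applied to a window of sensor samples.

def conv1d_valid(signal, kernel):
    """'Valid' 1-D convolution (cross-correlation, as CNN layers compute it)."""
    n, k = len(signal), len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(n - k + 1)]

def relu(xs):
    """Elementwise rectified linear activation."""
    return [max(0.0, x) for x in xs]

def max_pool(xs, size):
    """Non-overlapping max pooling; a trailing partial window is dropped."""
    return [max(xs[i:i + size]) for i in range(0, len(xs) - size + 1, size)]

# Invented single-channel "EMG" window and a 3-tap difference kernel.
window = [0.0, 0.1, 0.9, 1.0, 0.2, 0.0, 0.0, 0.1]
kernel = [-1.0, 0.0, 1.0]
features = max_pool(relu(conv1d_valid(window, kernel)), 2)
print(features)  # three pooled activations, ≈ [0.9, 0.0, 0.1]
```

A real extractor would stack several such layers with learned multichannel kernels over the fused IMU and EMG streams; this sketch only fixes the shape of the computation.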
format Online
Article
Text
id pubmed-9345758
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-9345758 2022-08-04 Front Neurosci Neuroscience Frontiers Media S.A. 2022-07-19 /pmc/articles/PMC9345758/ /pubmed/35937881 http://dx.doi.org/10.3389/fnins.2022.962141 Text en Copyright © 2022 Gu, Zheng, Todoh and Zha. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title American Sign Language Translation Using Wearable Inertial and Electromyography Sensors for Tracking Hand Movements and Facial Expressions
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9345758/
https://www.ncbi.nlm.nih.gov/pubmed/35937881
http://dx.doi.org/10.3389/fnins.2022.962141