
User-Independent Hand Gesture Recognition Classification Models Using Sensor Fusion

Bibliographic Details
Main Authors: Colli Alfaro, Jose Guillermo, Trejos, Ana Luisa
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8963034/
https://www.ncbi.nlm.nih.gov/pubmed/35214223
http://dx.doi.org/10.3390/s22041321
author Colli Alfaro, Jose Guillermo
Trejos, Ana Luisa
collection PubMed
description Recently, it has been shown that targeting motor impairments as early as possible, using wearable mechatronic devices for assisted therapy, can improve rehabilitation outcomes. However, despite advances in control methods for wearable mechatronic devices, the need remains for a more natural interface that allows for better control. To address this issue, electromyography (EMG)-based gesture recognition systems have been studied as a potential solution for human–machine interface applications. Recent studies have focused on developing user-independent gesture recognition interfaces to reduce calibration times for new users. Unfortunately, given the stochastic nature of EMG signals, the performance of these interfaces is negatively impacted. To address this issue, this work presents a user-independent gesture classification method based on a sensor fusion technique that combines EMG data with inertial measurement unit (IMU) data. The Myo Armband was used to measure muscle activity and motion data from healthy subjects. Participants were asked to perform seven types of gestures in four different arm positions while wearing the Myo on their dominant limb. Data obtained from 22 participants were used to classify the gestures using three different classification methods. Overall, average classification accuracies in the range of 67.5–84.6% were obtained, with the Adaptive Least-Squares Support Vector Machine model reaching accuracies as high as 92.9%. These results suggest that the proposed sensor fusion approach makes it possible to achieve a more natural interface that allows better control of wearable mechatronic devices during robot-assisted therapies.
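The description above outlines the pipeline at a high level: EMG and IMU features captured by the Myo Armband are fused and passed to a classifier, and models are evaluated across subjects so that new users need no calibration. As a minimal illustrative sketch only, the Python snippet below shows feature-level fusion with leave-one-subject-out cross-validation on synthetic stand-in data; it uses scikit-learn's standard RBF-kernel SVM rather than the paper's Adaptive Least-Squares SVM, and every array shape, feature choice, and hyperparameter is an assumption for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 22 subjects x 7 gestures x 20 repetitions.
# Each repetition is summarized by 8 EMG features (e.g., one mean-absolute-value
# per Myo channel) and 4 IMU features (e.g., quaternion components); both the
# feature choices and the distributions below are assumptions for illustration.
n_subjects, n_gestures, n_reps = 22, 7, 20
emg_feats, imu_feats, labels, subject_ids = [], [], [], []
for subject in range(n_subjects):
    for gesture in range(n_gestures):
        for _ in range(n_reps):
            emg_feats.append(rng.normal(loc=gesture, scale=2.0, size=8))
            imu_feats.append(rng.normal(loc=gesture % 4, scale=0.5, size=4))
            labels.append(gesture)
            subject_ids.append(subject)

# Feature-level sensor fusion: concatenate the EMG and IMU feature vectors
# into a single feature matrix before classification.
X = np.hstack([np.asarray(emg_feats), np.asarray(imu_feats)])
y = np.asarray(labels)
groups = np.asarray(subject_ids)

# User-independent evaluation: leave-one-subject-out cross-validation, so the
# tested subject never contributes training data.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"Mean accuracy across {len(scores)} held-out subjects: {scores.mean():.3f}")
```

Passing the subject IDs as `groups` to `LeaveOneGroupOut` is what makes the estimate user-independent: the held-out subject contributes no training data, which mirrors the calibration-free scenario the description targets.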
format Online
Article
Text
id pubmed-8963034
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8963034 2022-03-30
journal Sensors (Basel)
published_online 2022-02-09 MDPI
license © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title User-Independent Hand Gesture Recognition Classification Models Using Sensor Fusion
topic Article