Upper-Limb Motion Recognition Based on Hybrid Feature Selection: Algorithm Development and Validation
BACKGROUND: For rehabilitation training systems, it is essential to automatically record and recognize exercises, especially when more than one type of exercise is performed without a predefined sequence. Most motion recognition methods are based on feature engineering and machine learning algorithm...
Main authors: | Li, Qiaoqin, Liu, Yongguo, Zhu, Jiajing, Chen, Zhi, Liu, Lang, Yang, Shangming, Zhu, Guanyi, Zhu, Bin, Li, Juan, Jin, Rongjiang, Tao, Jing, Chen, Lidian |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | JMIR Publications 2021 |
Subjects: | Original Paper |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8446846/ https://www.ncbi.nlm.nih.gov/pubmed/34473067 http://dx.doi.org/10.2196/24402 |
_version_ | 1784568967858225152 |
---|---|
author | Li, Qiaoqin Liu, Yongguo Zhu, Jiajing Chen, Zhi Liu, Lang Yang, Shangming Zhu, Guanyi Zhu, Bin Li, Juan Jin, Rongjiang Tao, Jing Chen, Lidian |
author_facet | Li, Qiaoqin Liu, Yongguo Zhu, Jiajing Chen, Zhi Liu, Lang Yang, Shangming Zhu, Guanyi Zhu, Bin Li, Juan Jin, Rongjiang Tao, Jing Chen, Lidian |
author_sort | Li, Qiaoqin |
collection | PubMed |
description | BACKGROUND: For rehabilitation training systems, it is essential to automatically record and recognize exercises, especially when more than one type of exercise is performed without a predefined sequence. Most motion recognition methods are based on feature engineering and machine learning algorithms. Time-domain and frequency-domain features are extracted from original time series data collected by sensor nodes. For high-dimensional data, feature selection plays an important role in improving the performance of motion recognition. Existing feature selection methods can be categorized into filter and wrapper methods. Wrapper methods usually achieve better performance than filter methods; however, in most cases, they are computationally intensive, and the feature subset obtained is usually optimized only for the specific learning algorithm. OBJECTIVE: This study aimed to provide a feature selection method for motion recognition of upper-limb exercises and improve the recognition performance. METHODS: Motion data from 5 types of upper-limb exercises performed by 21 participants were collected by a customized inertial measurement unit (IMU) node. A total of 60 time-domain and frequency-domain features were extracted from the original sensor data. A hybrid feature selection method combining filter and wrapper methods (FESCOM) was proposed to eliminate irrelevant features for motion recognition of upper-limb exercises. In the filter stage, candidate features were first selected from the original feature set according to their significance for motion recognition. In the wrapper stage, k-nearest neighbors (kNN), Naïve Bayes (NB), and random forest (RF) were evaluated as the wrapping components to further refine the features from the candidate feature set. The performance of the proposed FESCOM method was verified using experiments on motion recognition of upper-limb exercises and compared with the traditional wrapper method. RESULTS: Using kNN, NB, and RF as the wrapping components, the classification error rates of the proposed FESCOM method were 1.7%, 8.9%, and 7.4%, respectively, and the feature selection time in each iteration was 13 seconds, 71 seconds, and 541 seconds, respectively. CONCLUSIONS: The experimental results demonstrated that, in the case of 5 motion types performed by 21 healthy participants, the proposed FESCOM method using kNN and NB as the wrapping components achieved better recognition performance than the traditional wrapper method. The FESCOM method dramatically reduces the search time in the feature selection process. The results also demonstrated that the optimal number of features depends on the classifier. This approach serves to improve feature selection and classification algorithm selection for upper-limb motion recognition based on wearable sensor data, which can be extended to motion recognition of more motion types and participants. |
format | Online Article Text |
id | pubmed-8446846 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | JMIR Publications |
record_format | MEDLINE/PubMed |
spelling | pubmed-84468462021-10-06 Upper-Limb Motion Recognition Based on Hybrid Feature Selection: Algorithm Development and Validation Li, Qiaoqin Liu, Yongguo Zhu, Jiajing Chen, Zhi Liu, Lang Yang, Shangming Zhu, Guanyi Zhu, Bin Li, Juan Jin, Rongjiang Tao, Jing Chen, Lidian JMIR Mhealth Uhealth Original Paper BACKGROUND: For rehabilitation training systems, it is essential to automatically record and recognize exercises, especially when more than one type of exercise is performed without a predefined sequence. Most motion recognition methods are based on feature engineering and machine learning algorithms. Time-domain and frequency-domain features are extracted from original time series data collected by sensor nodes. For high-dimensional data, feature selection plays an important role in improving the performance of motion recognition. Existing feature selection methods can be categorized into filter and wrapper methods. Wrapper methods usually achieve better performance than filter methods; however, in most cases, they are computationally intensive, and the feature subset obtained is usually optimized only for the specific learning algorithm. OBJECTIVE: This study aimed to provide a feature selection method for motion recognition of upper-limb exercises and improve the recognition performance. METHODS: Motion data from 5 types of upper-limb exercises performed by 21 participants were collected by a customized inertial measurement unit (IMU) node. A total of 60 time-domain and frequency-domain features were extracted from the original sensor data. A hybrid feature selection method by combining filter and wrapper methods (FESCOM) was proposed to eliminate irrelevant features for motion recognition of upper-limb exercises. In the filter stage, candidate features were first selected from the original feature set according to the significance for motion recognition. In the wrapper stage, k-nearest neighbors (kNN), Naïve Bayes (NB), and random forest (RF) were evaluated as the wrapping components to further refine the features from the candidate feature set. The performance of the proposed FESCOM method was verified using experiments on motion recognition of upper-limb exercises and compared with the traditional wrapper method. RESULTS: Using kNN, NB, and RF as the wrapping components, the classification error rates of the proposed FESCOM method were 1.7%, 8.9%, and 7.4%, respectively, and the feature selection time in each iteration was 13 seconds, 71 seconds, and 541 seconds, respectively. CONCLUSIONS: The experimental results demonstrated that, in the case of 5 motion types performed by 21 healthy participants, the proposed FESCOM method using kNN and NB as the wrapping components achieved better recognition performance than the traditional wrapper method. The FESCOM method dramatically reduces the search time in the feature selection process. The results also demonstrated that the optimal number of features depends on the classifier. This approach serves to improve feature selection and classification algorithm selection for upper-limb motion recognition based on wearable sensor data, which can be extended to motion recognition of more motion types and participants. JMIR Publications 2021-09-02 /pmc/articles/PMC8446846/ /pubmed/34473067 http://dx.doi.org/10.2196/24402 Text en ©Qiaoqin Li, Yongguo Liu, Jiajing Zhu, Zhi Chen, Lang Liu, Shangming Yang, Guanyi Zhu, Bin Zhu, Juan Li, Rongjiang Jin, Jing Tao, Lidian Chen. 
Originally published in JMIR mHealth and uHealth (https://mhealth.jmir.org), 02.09.2021. https://creativecommons.org/licenses/by/4.0/This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mHealth and uHealth, is properly cited. The complete bibliographic information, a link to the original publication on https://mhealth.jmir.org/, as well as this copyright and license information must be included. |
spellingShingle | Original Paper Li, Qiaoqin Liu, Yongguo Zhu, Jiajing Chen, Zhi Liu, Lang Yang, Shangming Zhu, Guanyi Zhu, Bin Li, Juan Jin, Rongjiang Tao, Jing Chen, Lidian Upper-Limb Motion Recognition Based on Hybrid Feature Selection: Algorithm Development and Validation |
title | Upper-Limb Motion Recognition Based on Hybrid Feature Selection: Algorithm Development and Validation |
title_full | Upper-Limb Motion Recognition Based on Hybrid Feature Selection: Algorithm Development and Validation |
title_fullStr | Upper-Limb Motion Recognition Based on Hybrid Feature Selection: Algorithm Development and Validation |
title_full_unstemmed | Upper-Limb Motion Recognition Based on Hybrid Feature Selection: Algorithm Development and Validation |
title_short | Upper-Limb Motion Recognition Based on Hybrid Feature Selection: Algorithm Development and Validation |
title_sort | upper-limb motion recognition based on hybrid feature selection: algorithm development and validation |
topic | Original Paper |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8446846/ https://www.ncbi.nlm.nih.gov/pubmed/34473067 http://dx.doi.org/10.2196/24402 |
work_keys_str_mv | AT liqiaoqin upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT liuyongguo upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT zhujiajing upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT chenzhi upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT liulang upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT yangshangming upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT zhuguanyi upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT zhubin upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT lijuan upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT jinrongjiang upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT taojing upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation AT chenlidian upperlimbmotionrecognitionbasedonhybridfeatureselectionalgorithmdevelopmentandvalidation |
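The METHODS summary in the record above outlines a two-stage selection pipeline: a filter stage that keeps the features most significant for the motion labels, followed by a wrapper stage that refines that candidate set with kNN, NB, or RF as the wrapping component. The sketch below is a minimal, generic illustration of that filter-plus-wrapper pattern using scikit-learn, not the authors' FESCOM implementation; the synthetic data, the ANOVA F-test filter, the forward-selection wrapper, and all parameter values are assumptions, since the record does not specify them.

```python
# A minimal, generic sketch of a two-stage (filter + wrapper) feature selection
# pipeline in the spirit of the FESCOM description above. This is NOT the
# authors' implementation: the synthetic data, scoring functions, and all
# parameter values below are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, SequentialFeatureSelector, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Stand-in for the 60 time- and frequency-domain features described in the abstract.
X, y = make_classification(n_samples=500, n_features=60, n_informative=12,
                           n_classes=5, random_state=0)

# Filter stage: keep the candidate features ranked most significant for the
# labels (here via an ANOVA F-test; the paper's exact criterion may differ).
filter_stage = SelectKBest(score_func=f_classif, k=30)
X_candidates = filter_stage.fit_transform(X, y)

# Wrapper stage: refine the candidate set by forward selection wrapped around
# kNN (NB or RF could be substituted as the wrapping component).
knn = KNeighborsClassifier(n_neighbors=5)
wrapper_stage = SequentialFeatureSelector(knn, n_features_to_select=10,
                                          direction="forward", cv=5)
X_selected = wrapper_stage.fit_transform(X_candidates, y)

# Evaluate the wrapped classifier on the final feature subset.
scores = cross_val_score(knn, X_selected, y, cv=5)
print(f"Mean cross-validated accuracy on selected features: {scores.mean():.3f}")
```

Splitting the search this way is what keeps the wrapper stage cheap: the classifier-in-the-loop search runs only over the filtered candidates rather than all 60 original features, which is consistent with the reduced selection times reported in the RESULTS above.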