
Human Lower Limb Motion Capture and Recognition Based on Smartphones

Bibliographic Details
Main Authors: Duan, Lin-Tao, Lawo, Michael, Wang, Zhi-Guo, Wang, Hai-Ying
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9319117/
https://www.ncbi.nlm.nih.gov/pubmed/35890952
http://dx.doi.org/10.3390/s22145273
Description
Summary: Human motion recognition based on wearable devices plays a vital role in pervasive computing. Smartphones have built-in motion sensors that measure the motion of the device with high precision. In this paper, we propose a human lower limb motion capture and recognition approach based on a smartphone. We design a motion logger to record five categories of limb activities (standing up, sitting down, walking, going upstairs, and going downstairs) using two motion sensors (tri-axial accelerometer, tri-axial gyroscope). We extract motion features from the frequency domain of the sensing data using the Fast Fourier Transform (FFT) and select a subset of them as a feature vector. We classify and predict human lower limb motion using three supervised learning algorithms: Naïve Bayes (NB), K-Nearest Neighbor (KNN), and Artificial Neural Networks (ANNs). We use 670 lower limb motion samples to train and evaluate these classifiers using 10-fold cross-validation. Finally, we design and implement a live detection system to validate our motion detection approach. The experimental results show that our low-cost approach can recognize human lower limb activities with acceptable accuracy: on average, the recognition rates of NB, KNN, and ANNs are 97.01%, 96.12%, and 98.21%, respectively.
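
The abstract describes a pipeline of windowed sensor recordings, FFT-based frequency-domain features, and a supervised classifier evaluated with 10-fold cross-validation. As a rough illustration of that kind of pipeline (not the authors' actual implementation), the following minimal Python sketch uses NumPy and scikit-learn; the window layout, the choice of eight low-frequency magnitudes plus the dominant-frequency bin per axis, and k=5 neighbors are illustrative assumptions, not details taken from the paper.

# Hypothetical sketch of an FFT-feature + KNN activity-recognition
# pipeline, loosely following the abstract. Feature choices, window
# size, and k are illustrative assumptions, not values from the paper.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def fft_features(window):
    """Frequency-domain features for one sensor window.

    window: array of shape (n_samples, 6) -- three accelerometer axes
    followed by three gyroscope axes (n_samples >= 16 assumed).
    """
    feats = []
    for axis in range(window.shape[1]):
        spectrum = np.abs(np.fft.rfft(window[:, axis]))
        spectrum[0] = 0.0                  # drop the DC component
        feats.append(spectrum[:8])         # low-frequency magnitudes
        feats.append([spectrum.argmax()])  # dominant frequency bin
    return np.concatenate(feats)

def evaluate(windows, labels, k=5):
    """Train/evaluate a KNN classifier with 10-fold cross-validation.

    windows: list of (n_samples, 6) arrays; labels: activity classes
    (standing up, sitting down, walking, upstairs, downstairs).
    """
    X = np.vstack([fft_features(w) for w in windows])
    clf = KNeighborsClassifier(n_neighbors=k)
    scores = cross_val_score(clf, X, labels, cv=10)  # 10-fold CV
    return scores.mean()

The same feature matrix could be fed to scikit-learn's GaussianNB or MLPClassifier to mirror the paper's NB and ANN comparisons; only the classifier object changes, the FFT feature extraction stays the same.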