An sEMG-Based Human-Exoskeleton Interface Fusing Convolutional Neural Networks With Hand-Crafted Features

In recent years, human-robot interfaces (HRIs) based on surface electromyography (sEMG) have been widely used in lower-limb exoskeleton robots for movement prediction during rehabilitation training for patients with hemiplegia. However, accurate and efficient lower-limb movement prediction for p...

Bibliographic Details
Main Authors: Yang, Xiao, Fu, Zhe, Li, Bing, Liu, Jun
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2022
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9284005/
https://www.ncbi.nlm.nih.gov/pubmed/35845758
http://dx.doi.org/10.3389/fnbot.2022.938345
_version_ 1784747458977333248
author Yang, Xiao
Fu, Zhe
Li, Bing
Liu, Jun
author_facet Yang, Xiao
Fu, Zhe
Li, Bing
Liu, Jun
author_sort Yang, Xiao
collection PubMed
description In recent years, human-robot interfaces (HRIs) based on surface electromyography (sEMG) have been widely used in lower-limb exoskeleton robots for movement prediction during rehabilitation training for patients with hemiplegia. However, accurate and efficient lower-limb movement prediction for patients with hemiplegia remains a challenge due to complex movement information and individual differences. Traditional movement prediction methods usually use hand-crafted features, which are computationally cheap but can extract only shallow, heuristic information. Deep learning-based methods have stronger feature representation ability but tend to get trapped in local features, resulting in poor generalization performance. In this article, a human-exoskeleton interface fusing convolutional neural networks with hand-crafted features is proposed. Building on our previous study, a lower-limb movement prediction framework (HCSNet) for patients with hemiplegia is constructed by fusing time- and frequency-domain hand-crafted features with channel synergy learning-based features. An sEMG data acquisition experiment is designed to compare and analyze the effectiveness of HCSNet. Experimental results show that the method achieves 95.93% and 90.37% prediction accuracy in the within-subject and cross-subject cases, respectively. Compared with related lower-limb movement prediction methods, the proposed method achieves better prediction performance.
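
To make the fusion idea in the abstract concrete, below is a minimal, illustrative sketch (not the authors' HCSNet code) of how common time- and frequency-domain hand-crafted sEMG features can be concatenated with CNN-learned features ahead of a movement classifier. The feature set, network sizes, channel count, and names such as FusionNet and handcrafted_features are assumptions for illustration only, written against NumPy and PyTorch; a plain 1-D CNN stands in for the paper's channel-synergy learning branch purely to show the fusion point.

import numpy as np
import torch
import torch.nn as nn

def handcrafted_features(window: np.ndarray, fs: float = 1000.0) -> np.ndarray:
    """Common time/frequency-domain sEMG features per channel.
    window: array of shape (channels, samples)."""
    mav = np.mean(np.abs(window), axis=1)                        # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=1))                  # root mean square
    wl = np.sum(np.abs(np.diff(window, axis=1)), axis=1)         # waveform length
    zc = np.sum(np.diff(np.sign(window), axis=1) != 0, axis=1)   # zero crossings
    spec = np.abs(np.fft.rfft(window, axis=1)) ** 2              # power spectrum
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / fs)
    mnf = (spec * freqs).sum(axis=1) / (spec.sum(axis=1) + 1e-12)  # mean frequency
    return np.concatenate([mav, rms, wl, zc, mnf]).astype(np.float32)

class FusionNet(nn.Module):
    """CNN branch on the raw sEMG window plus an MLP branch on hand-crafted
    features, concatenated before the classifier (one plausible fusion scheme)."""
    def __init__(self, channels: int, handcrafted_dim: int, num_movements: int):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten())               # -> (batch, 64)
        self.mlp = nn.Sequential(nn.Linear(handcrafted_dim, 32), nn.ReLU())
        self.classifier = nn.Linear(64 + 32, num_movements)

    def forward(self, raw, feats):
        fused = torch.cat([self.cnn(raw), self.mlp(feats)], dim=1)
        return self.classifier(fused)

# Usage with a fake 8-channel, 200-sample window and 5 hypothetical movement classes.
window = np.random.randn(8, 200).astype(np.float32)
feats = handcrafted_features(window)
model = FusionNet(channels=8, handcrafted_dim=feats.size, num_movements=5)
logits = model(torch.from_numpy(window)[None], torch.from_numpy(feats)[None])
print(logits.shape)  # torch.Size([1, 5])
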
format Online
Article
Text
id pubmed-9284005
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-9284005 2022-07-16 An sEMG-Based Human-Exoskeleton Interface Fusing Convolutional Neural Networks With Hand-Crafted Features Yang, Xiao Fu, Zhe Li, Bing Liu, Jun Front Neurorobot Neuroscience In recent years, human-robot interfaces (HRIs) based on surface electromyography (sEMG) have been widely used in lower-limb exoskeleton robots for movement prediction during rehabilitation training for patients with hemiplegia. However, accurate and efficient lower-limb movement prediction for patients with hemiplegia remains a challenge due to complex movement information and individual differences. Traditional movement prediction methods usually use hand-crafted features, which are computationally cheap but can extract only shallow, heuristic information. Deep learning-based methods have stronger feature representation ability but tend to get trapped in local features, resulting in poor generalization performance. In this article, a human-exoskeleton interface fusing convolutional neural networks with hand-crafted features is proposed. Building on our previous study, a lower-limb movement prediction framework (HCSNet) for patients with hemiplegia is constructed by fusing time- and frequency-domain hand-crafted features with channel synergy learning-based features. An sEMG data acquisition experiment is designed to compare and analyze the effectiveness of HCSNet. Experimental results show that the method achieves 95.93% and 90.37% prediction accuracy in the within-subject and cross-subject cases, respectively. Compared with related lower-limb movement prediction methods, the proposed method achieves better prediction performance. Frontiers Media S.A. 2022-07-01 /pmc/articles/PMC9284005/ /pubmed/35845758 http://dx.doi.org/10.3389/fnbot.2022.938345 Text en Copyright © 2022 Yang, Fu, Li and Liu. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Yang, Xiao
Fu, Zhe
Li, Bing
Liu, Jun
An sEMG-Based Human-Exoskeleton Interface Fusing Convolutional Neural Networks With Hand-Crafted Features
title An sEMG-Based Human-Exoskeleton Interface Fusing Convolutional Neural Networks With Hand-Crafted Features
title_full An sEMG-Based Human-Exoskeleton Interface Fusing Convolutional Neural Networks With Hand-Crafted Features
title_fullStr An sEMG-Based Human-Exoskeleton Interface Fusing Convolutional Neural Networks With Hand-Crafted Features
title_full_unstemmed An sEMG-Based Human-Exoskeleton Interface Fusing Convolutional Neural Networks With Hand-Crafted Features
title_short An sEMG-Based Human-Exoskeleton Interface Fusing Convolutional Neural Networks With Hand-Crafted Features
title_sort semg-based human-exoskeleton interface fusing convolutional neural networks with hand-crafted features
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9284005/
https://www.ncbi.nlm.nih.gov/pubmed/35845758
http://dx.doi.org/10.3389/fnbot.2022.938345
work_keys_str_mv AT yangxiao ansemgbasedhumanexoskeletoninterfacefusingconvolutionalneuralnetworkswithhandcraftedfeatures
AT fuzhe ansemgbasedhumanexoskeletoninterfacefusingconvolutionalneuralnetworkswithhandcraftedfeatures
AT libing ansemgbasedhumanexoskeletoninterfacefusingconvolutionalneuralnetworkswithhandcraftedfeatures
AT liujun ansemgbasedhumanexoskeletoninterfacefusingconvolutionalneuralnetworkswithhandcraftedfeatures
AT yangxiao semgbasedhumanexoskeletoninterfacefusingconvolutionalneuralnetworkswithhandcraftedfeatures
AT fuzhe semgbasedhumanexoskeletoninterfacefusingconvolutionalneuralnetworkswithhandcraftedfeatures
AT libing semgbasedhumanexoskeletoninterfacefusingconvolutionalneuralnetworkswithhandcraftedfeatures
AT liujun semgbasedhumanexoskeletoninterfacefusingconvolutionalneuralnetworkswithhandcraftedfeatures