LoCoMo-Net: A Low-Complex Deep Learning Framework for sEMG-Based Hand Movement Recognition for Prosthetic Control
Background: Deep-learning-based myoelectric pattern recognition techniques improve performance but are computationally expensive and memory-intensive. Therefore, in this paper we report a deep learning framework named ‘Low-Complex Movement recognition-Net’ (LoCoMo-Net), built with a convolutional neural network (CNN), for recognizing wrist and finger flexion movements; grasping and functional movements; and force patterns from single-channel surface electromyography (sEMG) recordings. The network consists of a two-stage pipeline: 1) input data compression; 2) data-driven weight sharing. Methods: The proposed framework was validated on two different datasets: our own dataset (DS1) and the publicly available NinaPro dataset (DS2), for 16 movements and 50 movements respectively. Further, we prototyped the proposed LoCoMo-Net on a Virtex-7 Xilinx field-programmable gate array (FPGA) platform and validated it for 15 movements from DS1 to demonstrate its feasibility for real-time execution. Results: The effectiveness of the proposed LoCoMo-Net was verified by a comparative analysis against benchmarked models on the same datasets, wherein our proposed model outperformed Twin-Support Vector Machine (SVM) and an existing CNN-based model by average classification accuracies of 8.5% and 16.0% respectively. In addition, a hardware complexity analysis reveals the advantages of the two-stage pipeline, with approximately 27%, 49%, 50%, 23%, and 43% savings achieved in lookup tables (LUTs), registers, memory, power consumption, and computation time respectively. Conclusion: Such an accurate, low-complexity sEMG-based movement recognition system is clinically significant for its potential to improve the quality of life of amputees.
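The two-stage pipeline named in the abstract (input data compression followed by data-driven weight sharing) can be illustrated with a minimal sketch. Note that the compression factor, the block-averaging scheme, and the single shared 1-D kernel below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def compress_window(x, factor=4):
    """Stage 1 (assumed scheme): shrink the raw sEMG window by block-averaging,
    reducing the input length by the given factor."""
    n = len(x) // factor * factor
    return x[:n].reshape(-1, factor).mean(axis=1)

def shared_conv1d(x, kernel):
    """Stage 2 (assumed scheme): valid-mode 1-D convolution where one kernel is
    reused across all input positions, standing in for weight sharing."""
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel) for i in range(len(x) - k + 1)])

# Toy single-channel sEMG window (e.g., 1 s at a hypothetical 1 kHz sampling rate).
rng = np.random.default_rng(0)
window = rng.standard_normal(1000)

compressed = compress_window(window)                 # 1000 -> 250 samples
features = shared_conv1d(compressed, np.ones(5) / 5)  # 250 -> 246 feature values
print(compressed.shape, features.shape)
```

Compressing before convolving is what drives the reported hardware savings: every downstream multiply-accumulate operates on a quarter of the original samples, and reusing one kernel means only a handful of weights need to be stored in FPGA registers.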
Format: | Online Article Text |
---|---|
Language: | English |
Published: |
IEEE
2020
|
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7529116/ https://www.ncbi.nlm.nih.gov/pubmed/33014638 http://dx.doi.org/10.1109/JTEHM.2020.3023898 |
_version_ | 1783589369776439296 |
---|---|
collection | PubMed |
description | Background: Deep-learning-based myoelectric pattern recognition techniques improve performance but are computationally expensive and memory-intensive. Therefore, in this paper we report a deep learning framework named ‘Low-Complex Movement recognition-Net’ (LoCoMo-Net), built with a convolutional neural network (CNN), for recognizing wrist and finger flexion movements; grasping and functional movements; and force patterns from single-channel surface electromyography (sEMG) recordings. The network consists of a two-stage pipeline: 1) input data compression; 2) data-driven weight sharing. Methods: The proposed framework was validated on two different datasets: our own dataset (DS1) and the publicly available NinaPro dataset (DS2), for 16 movements and 50 movements respectively. Further, we prototyped the proposed LoCoMo-Net on a Virtex-7 Xilinx field-programmable gate array (FPGA) platform and validated it for 15 movements from DS1 to demonstrate its feasibility for real-time execution. Results: The effectiveness of the proposed LoCoMo-Net was verified by a comparative analysis against benchmarked models on the same datasets, wherein our proposed model outperformed Twin-Support Vector Machine (SVM) and an existing CNN-based model by average classification accuracies of 8.5% and 16.0% respectively. In addition, a hardware complexity analysis reveals the advantages of the two-stage pipeline, with approximately 27%, 49%, 50%, 23%, and 43% savings achieved in lookup tables (LUTs), registers, memory, power consumption, and computation time respectively. Conclusion: Such an accurate, low-complexity sEMG-based movement recognition system is clinically significant for its potential to improve the quality of life of amputees. |
format | Online Article Text |
id | pubmed-7529116 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | IEEE |
record_format | MEDLINE/PubMed |
spelling | pubmed-7529116 2020-10-02 LoCoMo-Net: A Low-Complex Deep Learning Framework for sEMG-Based Hand Movement Recognition for Prosthetic Control IEEE J Transl Eng Health Med Article Background: Deep-learning-based myoelectric pattern recognition techniques improve performance but are computationally expensive and memory-intensive. Therefore, in this paper we report a deep learning framework named ‘Low-Complex Movement recognition-Net’ (LoCoMo-Net), built with a convolutional neural network (CNN), for recognizing wrist and finger flexion movements; grasping and functional movements; and force patterns from single-channel surface electromyography (sEMG) recordings. The network consists of a two-stage pipeline: 1) input data compression; 2) data-driven weight sharing. Methods: The proposed framework was validated on two different datasets: our own dataset (DS1) and the publicly available NinaPro dataset (DS2), for 16 movements and 50 movements respectively. Further, we prototyped the proposed LoCoMo-Net on a Virtex-7 Xilinx field-programmable gate array (FPGA) platform and validated it for 15 movements from DS1 to demonstrate its feasibility for real-time execution. Results: The effectiveness of the proposed LoCoMo-Net was verified by a comparative analysis against benchmarked models on the same datasets, wherein our proposed model outperformed Twin-Support Vector Machine (SVM) and an existing CNN-based model by average classification accuracies of 8.5% and 16.0% respectively. In addition, a hardware complexity analysis reveals the advantages of the two-stage pipeline, with approximately 27%, 49%, 50%, 23%, and 43% savings achieved in lookup tables (LUTs), registers, memory, power consumption, and computation time respectively.
Conclusion: Such an accurate, low-complexity sEMG-based movement recognition system is clinically significant for its potential to improve the quality of life of amputees. IEEE 2020-09-15 /pmc/articles/PMC7529116/ /pubmed/33014638 http://dx.doi.org/10.1109/JTEHM.2020.3023898 Text en https://creativecommons.org/licenses/by/4.0/ This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/ |
spellingShingle | Article LoCoMo-Net: A Low-Complex Deep Learning Framework for sEMG-Based Hand Movement Recognition for Prosthetic Control |
title | LoCoMo-Net: A Low-Complex Deep Learning Framework for sEMG-Based Hand Movement Recognition for Prosthetic Control |
title_full | LoCoMo-Net: A Low-Complex Deep Learning Framework for sEMG-Based Hand Movement Recognition for Prosthetic Control |
title_fullStr | LoCoMo-Net: A Low-Complex Deep Learning Framework for sEMG-Based Hand Movement Recognition for Prosthetic Control |
title_full_unstemmed | LoCoMo-Net: A Low-Complex Deep Learning Framework for sEMG-Based Hand Movement Recognition for Prosthetic Control |
title_short | LoCoMo-Net: A Low-Complex Deep Learning Framework for sEMG-Based Hand Movement Recognition for Prosthetic Control |
title_sort | locomo-net: a low-complex deep learning framework for semg-based hand movement recognition for prosthetic control
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7529116/ https://www.ncbi.nlm.nih.gov/pubmed/33014638 http://dx.doi.org/10.1109/JTEHM.2020.3023898 |
work_keys_str_mv | AT locomonetalowcomplexdeeplearningframeworkforsemgbasedhandmovementrecognitionforprostheticcontrol |