fNIRS-Based Upper Limb Motion Intention Recognition Using an Artificial Neural Network for Transhumeral Amputees
Prosthetic arms are designed to assist individuals with amputations in performing activities of daily life. Brain-machine interfaces are currently employed to enhance the accuracy as well as the number of control commands for upper limb prostheses. However, motion prediction for prosthetic arms...
Main Authors: Sattar, Neelum Yousaf; Kausar, Zareena; Usama, Syed Ali; Farooq, Umer; Shah, Muhammad Faizan; Muhammad, Shaheer; Khan, Razaullah; Badran, Mohamed
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8837999/ https://www.ncbi.nlm.nih.gov/pubmed/35161473 http://dx.doi.org/10.3390/s22030726
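The abstract describes a simple processing chain: FIR filtering of the fNIRS signals, per-channel signal mean, peak, and minimum as features, and an artificial neural network classifier for the six motions. The following is a minimal sketch of that kind of pipeline on synthetic data, assuming SciPy and scikit-learn; the sampling rate, channel count, window length, filter band, and network sizes are illustrative assumptions, not the authors' published configuration.

```python
import numpy as np
from scipy.signal import firwin, filtfilt
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Illustrative assumptions (not taken from the paper): sampling rate,
# channel count, trial length, filter band, and network size.
FS = 7.8            # assumed fNIRS sampling rate in Hz
N_CHANNELS = 16     # assumed number of motor-cortex channels
N_CLASSES = 6       # elbow ext./flex., wrist pron./sup., hand open/close
TRIAL_LEN = 80      # assumed samples per trial window

def fir_bandpass(trial, fs=FS, low=0.01, high=0.2, numtaps=21):
    """Zero-phase FIR band-pass filter applied channel-wise."""
    taps = firwin(numtaps, [low, high], pass_zero=False, fs=fs)
    return filtfilt(taps, [1.0], trial, axis=0)

def extract_features(trial):
    """Per-channel signal mean, peak (max), and minimum, as named in the abstract."""
    return np.concatenate([trial.mean(axis=0), trial.max(axis=0), trial.min(axis=0)])

# Synthetic stand-in data: 300 trials of shape (samples, channels) with labels 0..5.
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(300, TRIAL_LEN, N_CHANNELS))
y = rng.integers(0, N_CLASSES, size=300)

X = np.stack([extract_features(fir_bandpass(t)) for t in X_raw])
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# A small fully connected ANN; layer sizes are placeholders, not the paper's.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

On real recordings, the synthetic arrays would be replaced by trial-segmented HbO/HbR time series exported from the acquisition software; the structure of the pipeline stays the same.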
_version_ | 1784650017586282496 |
author | Sattar, Neelum Yousaf; Kausar, Zareena; Usama, Syed Ali; Farooq, Umer; Shah, Muhammad Faizan; Muhammad, Shaheer; Khan, Razaullah; Badran, Mohamed |
author_sort | Sattar, Neelum Yousaf |
collection | PubMed |
description | Prosthetic arms are designed to assist individuals with amputations in performing activities of daily life. Brain-machine interfaces are currently employed to enhance the accuracy as well as the number of control commands for upper limb prostheses. However, motion prediction for prosthetic arms and the rehabilitation of amputees with transhumeral amputations remain limited. In this paper, a functional near-infrared spectroscopy (fNIRS)-based approach for the recognition of human intention for six upper limb motions is proposed. The data were collected from fifteen healthy subjects and three transhumeral amputees performing elbow extension, elbow flexion, wrist pronation, wrist supination, hand open, and hand close. The fNIRS signals were acquired from the motor cortex region of the brain with the commercial NIRSport device. The acquired data samples were filtered using a finite impulse response (FIR) filter, and the signal mean, signal peak, and signal minimum were computed as the feature set. An artificial neural network (ANN) was applied to these features. The results show that the six arm actions can be classified with an accuracy of 78%. Comparable results have not yet been reported in any identical study. These fNIRS-based intention-detection results are promising and suggest that the approach can be applied to the real-time control of a transhumeral prosthesis. |
format | Online Article Text |
id | pubmed-8837999 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-8837999 2022-02-13. fNIRS-Based Upper Limb Motion Intention Recognition Using an Artificial Neural Network for Transhumeral Amputees. Sattar, Neelum Yousaf; Kausar, Zareena; Usama, Syed Ali; Farooq, Umer; Shah, Muhammad Faizan; Muhammad, Shaheer; Khan, Razaullah; Badran, Mohamed. Sensors (Basel), Article. MDPI, 2022-01-18. /pmc/articles/PMC8837999/ /pubmed/35161473 http://dx.doi.org/10.3390/s22030726. Text en. © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
title | fNIRS-Based Upper Limb Motion Intention Recognition Using an Artificial Neural Network for Transhumeral Amputees |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8837999/ https://www.ncbi.nlm.nih.gov/pubmed/35161473 http://dx.doi.org/10.3390/s22030726 |