
Human facial neural activities and gesture recognition for machine-interfacing applications

Bibliographic Details

Main Authors: Hamedi, M, Salleh, Sh-Hussain, Tan, TS, Ismail, K, Ali, J, Dee-Uam, C, Pavaganun, C, Yupapin, PP
Format: Online Article Text
Language: English
Published: Dove Medical Press 2011
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3260039/
https://www.ncbi.nlm.nih.gov/pubmed/22267930
http://dx.doi.org/10.2147/IJN.S26619
_version_ 1782221429786279936
author Hamedi, M
Salleh, Sh-Hussain
Tan, TS
Ismail, K
Ali, J
Dee-Uam, C
Pavaganun, C
Yupapin, PP
author_facet Hamedi, M
Salleh, Sh-Hussain
Tan, TS
Ismail, K
Ali, J
Dee-Uam, C
Pavaganun, C
Yupapin, PP
author_sort Hamedi, M
collection PubMed
description The authors present a new method of recognizing different human facial gestures through their neural activities and muscle movements, which can be used in machine-interfacing applications. Human–machine interface (HMI) technology utilizes human neural activities as input controllers for the machine. Recently, much work has been done on the specific application of facial electromyography (EMG)-based HMI, which has used limited and fixed numbers of facial gestures. In this work, a multipurpose interface is suggested that can support 2–11 control commands and can be applied to various HMI systems. The significance of this work is finding the most accurately recognized facial gestures for any application with a maximum of eleven control commands. Eleven facial gesture EMGs are recorded from ten volunteers. Detected EMGs are passed through a band-pass filter, and root mean square features are extracted. Various combinations of gestures, with a different number of gestures in each group, are made from the recorded facial gestures. Finally, all combinations are trained and classified by a fuzzy c-means classifier, and the combinations with the highest recognition accuracy in each group are chosen. An average accuracy above 90% for the chosen combinations demonstrates their suitability as command controllers.
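The pipeline summarized in the description (root mean square feature extraction over EMG windows, followed by fuzzy c-means classification) can be sketched in Python. This is an illustrative sketch, not the authors' implementation: the band-pass filtering step is omitted, and the window length, fuzzifier `m`, iteration count, initialization scheme, and the synthetic signal amplitudes are all assumptions.

```python
import math

def rms(window):
    """Root mean square of one EMG window -- the feature used in the study."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def fuzzy_c_means(points, c, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means. Returns the cluster centers and a
    membership matrix u, where u[i][j] is the degree (0..1) to which
    point i belongs to cluster j; each row of u sums to 1."""
    pts = sorted(points)
    # Deterministic init: spread the initial centers across the data range.
    centers = [pts[(len(pts) - 1) * j // (c - 1)] for j in range(c)]
    u = [[0.0] * c for _ in points]
    for _ in range(iters):
        # Membership update: closer centers receive higher fuzzy membership.
        for i, x in enumerate(points):
            d = [abs(x - v) or 1e-12 for v in centers]
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                    for k in range(c))
        # Center update: membership-weighted mean of the points.
        for j in range(c):
            w = [u[i][j] ** m for i in range(len(points))]
            centers[j] = sum(wi * x for wi, x in zip(w, points)) / sum(w)
    return centers, u

# Synthetic features standing in for two gestures: five low-amplitude and
# five high-amplitude EMG windows (constant signals, so rms == amplitude).
feats = ([rms([0.10 + 0.002 * k] * 64) for k in range(5)]
         + [rms([1.00 + 0.02 * k] * 64) for k in range(5)])
centers, u = fuzzy_c_means(feats, c=2)
```

Here `c` would correspond to the number of control commands in a candidate gesture combination; in the study it ranges from 2 to 11, and the combination with the highest recognition accuracy is kept for each group size.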
format Online
Article
Text
id pubmed-3260039
institution National Center for Biotechnology Information
language English
publishDate 2011
publisher Dove Medical Press
record_format MEDLINE/PubMed
spelling pubmed-3260039 2012-01-20 Human facial neural activities and gesture recognition for machine-interfacing applications Hamedi, M Salleh, Sh-Hussain Tan, TS Ismail, K Ali, J Dee-Uam, C Pavaganun, C Yupapin, PP Int J Nanomedicine Original Research The authors present a new method of recognizing different human facial gestures through their neural activities and muscle movements, which can be used in machine-interfacing applications. Human–machine interface (HMI) technology utilizes human neural activities as input controllers for the machine. Recently, much work has been done on the specific application of facial electromyography (EMG)-based HMI, which has used limited and fixed numbers of facial gestures. In this work, a multipurpose interface is suggested that can support 2–11 control commands and can be applied to various HMI systems. The significance of this work is finding the most accurately recognized facial gestures for any application with a maximum of eleven control commands. Eleven facial gesture EMGs are recorded from ten volunteers. Detected EMGs are passed through a band-pass filter, and root mean square features are extracted. Various combinations of gestures, with a different number of gestures in each group, are made from the recorded facial gestures. Finally, all combinations are trained and classified by a fuzzy c-means classifier, and the combinations with the highest recognition accuracy in each group are chosen. An average accuracy above 90% for the chosen combinations demonstrates their suitability as command controllers. Dove Medical Press 2011 2011-12-16 /pmc/articles/PMC3260039/ /pubmed/22267930 http://dx.doi.org/10.2147/IJN.S26619 Text en © 2011 Hamedi et al, publisher and licensee Dove Medical Press Ltd. This is an Open Access article which permits unrestricted noncommercial use, provided the original work is properly cited.
spellingShingle Original Research
Hamedi, M
Salleh, Sh-Hussain
Tan, TS
Ismail, K
Ali, J
Dee-Uam, C
Pavaganun, C
Yupapin, PP
Human facial neural activities and gesture recognition for machine-interfacing applications
title Human facial neural activities and gesture recognition for machine-interfacing applications
title_full Human facial neural activities and gesture recognition for machine-interfacing applications
title_fullStr Human facial neural activities and gesture recognition for machine-interfacing applications
title_full_unstemmed Human facial neural activities and gesture recognition for machine-interfacing applications
title_short Human facial neural activities and gesture recognition for machine-interfacing applications
title_sort human facial neural activities and gesture recognition for machine-interfacing applications
topic Original Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3260039/
https://www.ncbi.nlm.nih.gov/pubmed/22267930
http://dx.doi.org/10.2147/IJN.S26619
work_keys_str_mv AT hamedim humanfacialneuralactivitiesandgesturerecognitionformachineinterfacingapplications
AT sallehshhussain humanfacialneuralactivitiesandgesturerecognitionformachineinterfacingapplications
AT tants humanfacialneuralactivitiesandgesturerecognitionformachineinterfacingapplications
AT ismailk humanfacialneuralactivitiesandgesturerecognitionformachineinterfacingapplications
AT alij humanfacialneuralactivitiesandgesturerecognitionformachineinterfacingapplications
AT deeuamc humanfacialneuralactivitiesandgesturerecognitionformachineinterfacingapplications
AT pavaganunc humanfacialneuralactivitiesandgesturerecognitionformachineinterfacingapplications
AT yupapinpp humanfacialneuralactivitiesandgesturerecognitionformachineinterfacingapplications