
Toward More Robust Hand Gesture Recognition on EIT Data

Striving for more robust and natural control of multi-fingered hand prostheses, we are studying electrical impedance tomography (EIT) as a method to monitor residual muscle activations. Previous work has shown promising results for hand gesture recognition, but also lacks generalization across multiple sessions and users. Thus, the present paper aims for a detailed analysis of an existing EIT dataset acquired with a 16-electrode wrist band as a prerequisite for further improvements of machine learning results on this type of signal. The t-SNE analysis we performed confirms a much stronger inter-session and inter-user variance compared to the expected in-class variance. Additionally, we observe a strong drift of signals within a session. To handle these challenging problems, we propose new machine learning architectures based on deep learning, which allow us to separate undesired from desired variation and thus significantly improve the classification accuracy. With these new architectures, we increased cross-session classification accuracy on 12 gestures from 19.55 to 30.45%. Based on a fundamental data analysis, we developed three calibration methods and thus were able to further increase cross-session classification accuracy to 39.01, 55.37, and 56.34%, respectively.


Bibliographic Details
Main Authors: Leins, David P., Gibas, Christian, Brück, Rainer, Haschke, Robert
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8385652/
https://www.ncbi.nlm.nih.gov/pubmed/34456704
http://dx.doi.org/10.3389/fnbot.2021.659311
_version_ 1783742133047394304
author Leins, David P.
Gibas, Christian
Brück, Rainer
Haschke, Robert
author_facet Leins, David P.
Gibas, Christian
Brück, Rainer
Haschke, Robert
author_sort Leins, David P.
collection PubMed
description Striving for more robust and natural control of multi-fingered hand prostheses, we are studying electrical impedance tomography (EIT) as a method to monitor residual muscle activations. Previous work has shown promising results for hand gesture recognition, but also lacks generalization across multiple sessions and users. Thus, the present paper aims for a detailed analysis of an existing EIT dataset acquired with a 16-electrode wrist band as a prerequisite for further improvements of machine learning results on this type of signal. The t-SNE analysis we performed confirms a much stronger inter-session and inter-user variance compared to the expected in-class variance. Additionally, we observe a strong drift of signals within a session. To handle these challenging problems, we propose new machine learning architectures based on deep learning, which allow us to separate undesired from desired variation and thus significantly improve the classification accuracy. With these new architectures, we increased cross-session classification accuracy on 12 gestures from 19.55 to 30.45%. Based on a fundamental data analysis, we developed three calibration methods and thus were able to further increase cross-session classification accuracy to 39.01, 55.37, and 56.34%, respectively.
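
To make the kind of analysis described in the abstract concrete, the following is a minimal, illustrative Python sketch (not the authors' code) of a t-SNE inspection of EIT frames. The data are synthetic stand-ins; the frame dimension (16 electrodes with 13 measurements per current injection), the session and gesture counts, and the use of scikit-learn's TSNE are assumptions made purely for illustration.

    # Illustrative sketch only: synthetic stand-ins for 16-electrode EIT frames.
    import numpy as np
    from sklearn.manifold import TSNE

    rng = np.random.default_rng(0)

    n_sessions, n_gestures, n_frames = 3, 12, 40
    n_features = 16 * 13  # hypothetical: one impedance value per injection/measurement pair

    frames, session_ids, gesture_ids = [], [], []
    for s in range(n_sessions):
        session_offset = rng.normal(0.0, 2.0, n_features)        # strong inter-session shift
        for g in range(n_gestures):
            gesture_pattern = rng.normal(0.0, 0.5, n_features)   # weaker in-class (gesture) signal
            x = session_offset + gesture_pattern + rng.normal(0.0, 0.1, (n_frames, n_features))
            frames.append(x)
            session_ids.extend([s] * n_frames)
            gesture_ids.extend([g] * n_frames)

    X = np.concatenate(frames, axis=0)

    # Project to 2-D for visual inspection of the variance structure.
    emb = TSNE(n_components=2, perplexity=30, init="pca", random_state=0).fit_transform(X)
    print(emb.shape)  # (n_sessions * n_gestures * n_frames, 2)

With data shaped like this, the 2-D embedding groups points by session rather than by gesture, which mirrors the dominant inter-session variance reported in the paper and is the failure mode the proposed architectures and calibration methods are meant to address.
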
format Online
Article
Text
id pubmed-8385652
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-8385652 2021-08-26 Toward More Robust Hand Gesture Recognition on EIT Data Leins, David P. Gibas, Christian Brück, Rainer Haschke, Robert Front Neurorobot Neuroscience Striving for more robust and natural control of multi-fingered hand prostheses, we are studying electrical impedance tomography (EIT) as a method to monitor residual muscle activations. Previous work has shown promising results for hand gesture recognition, but also lacks generalization across multiple sessions and users. Thus, the present paper aims for a detailed analysis of an existing EIT dataset acquired with a 16-electrode wrist band as a prerequisite for further improvements of machine learning results on this type of signal. The t-SNE analysis we performed confirms a much stronger inter-session and inter-user variance compared to the expected in-class variance. Additionally, we observe a strong drift of signals within a session. To handle these challenging problems, we propose new machine learning architectures based on deep learning, which allow us to separate undesired from desired variation and thus significantly improve the classification accuracy. With these new architectures, we increased cross-session classification accuracy on 12 gestures from 19.55 to 30.45%. Based on a fundamental data analysis, we developed three calibration methods and thus were able to further increase cross-session classification accuracy to 39.01, 55.37, and 56.34%, respectively. Frontiers Media S.A. 2021-08-11 /pmc/articles/PMC8385652/ /pubmed/34456704 http://dx.doi.org/10.3389/fnbot.2021.659311 Text en Copyright © 2021 Leins, Gibas, Brück and Haschke. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Leins, David P.
Gibas, Christian
Brück, Rainer
Haschke, Robert
Toward More Robust Hand Gesture Recognition on EIT Data
title Toward More Robust Hand Gesture Recognition on EIT Data
title_full Toward More Robust Hand Gesture Recognition on EIT Data
title_fullStr Toward More Robust Hand Gesture Recognition on EIT Data
title_full_unstemmed Toward More Robust Hand Gesture Recognition on EIT Data
title_short Toward More Robust Hand Gesture Recognition on EIT Data
title_sort toward more robust hand gesture recognition on eit data
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8385652/
https://www.ncbi.nlm.nih.gov/pubmed/34456704
http://dx.doi.org/10.3389/fnbot.2021.659311
work_keys_str_mv AT leinsdavidp towardmorerobusthandgesturerecognitiononeitdata
AT gibaschristian towardmorerobusthandgesturerecognitiononeitdata
AT bruckrainer towardmorerobusthandgesturerecognitiononeitdata
AT haschkerobert towardmorerobusthandgesturerecognitiononeitdata