
Simulation for a MEMS-Based CTRNN Ultra-Low Power Implementation of Human Activity Recognition


Bibliographic Details

Main Authors: Emad-Ud-Din, Muhammad; Hasan, Mohammad H.; Jafari, Roozbeh; Pourkamali, Siavash; Alsaleem, Fadi
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8522023/
https://www.ncbi.nlm.nih.gov/pubmed/34713201
http://dx.doi.org/10.3389/fdgth.2021.731076
collection PubMed
description This paper presents an energy-efficient classification framework for human activity recognition (HAR). Typically, HAR classification tasks require a computational platform that includes a processor and memory along with sensors and their interfaces, all of which consume significant power. The presented framework employs a microelectromechanical systems (MEMS)-based Continuous Time Recurrent Neural Network (CTRNN) to perform HAR tasks very efficiently. We show that, in a real physical implementation, the MEMS-CTRNN nodes can perform computing while consuming power on the nanowatt scale, compared with the microwatt scale of state-of-the-art hardware. We also confirm that this large power reduction does not come at the expense of reduced performance by evaluating the framework's accuracy on the widely cited human activity recognition dataset (HAPT). Our simulation results show that the HAR framework, which consists of a training module and a network of MEMS-based CTRNN nodes, provides HAPT classification accuracy comparable to that of a traditional CTRNN and other Recurrent Neural Network (RNN) implementations. For example, the MEMS-based CTRNN model's average accuracy in the worst-case scenario of not using pre-processing techniques, such as quantization, to classify five different activities is 77.94%, compared with 78.48% for the traditional CTRNN.
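As background to the abstract above: the classifier it describes is built from Continuous Time Recurrent Neural Network (CTRNN) nodes. The sketch below is a minimal forward-Euler simulation of the *standard* CTRNN node dynamics (τ·dy/dt = −y + W·σ(y + θ) + I), not the paper's MEMS implementation; all network sizes, weights, and inputs here are illustrative assumptions.

```python
import numpy as np

def ctrnn_step(y, I, W, tau, theta, dt=0.01):
    """One forward-Euler step of standard CTRNN dynamics:
    tau_i * dy_i/dt = -y_i + sum_j W_ij * sigmoid(y_j + theta_j) + I_i
    """
    activation = 1.0 / (1.0 + np.exp(-(y + theta)))  # sigmoid firing rates
    dydt = (-y + W @ activation + I) / tau
    return y + dt * dydt

# Tiny 3-node network driven by a constant input (values are illustrative only)
rng = np.random.default_rng(0)
W = rng.normal(0.0, 1.0, (3, 3))   # synaptic weight matrix
tau = np.full(3, 0.5)              # per-node time constants
theta = np.zeros(3)                # biases
y = np.zeros(3)                    # node states
I = np.array([1.0, 0.0, 0.0])      # external input, e.g. one sensor sample

for _ in range(1000):              # integrate 10 time units at dt=0.01
    y = ctrnn_step(y, I, W, tau, theta)
print(y)                           # settled node states
```

Because the sigmoid output is bounded, the leak term −y keeps the states bounded and the network settles toward a fixed point for constant input; in a HAR setting the input I would instead be a stream of accelerometer samples, so the states trace an activity-dependent trajectory that a readout layer can classify.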
format Online
Article
Text
id pubmed-8522023
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-8522023 2021-10-27. Published in Front Digit Health (Digital Health section), Frontiers Media S.A., 2021-09-22. Text, English. Copyright © 2021 Emad-Ud-Din, Hasan, Jafari, Pourkamali and Alsaleem. Open-access article distributed under the terms of the Creative Commons Attribution License (CC BY): https://creativecommons.org/licenses/by/4.0/
title Simulation for a MEMS-Based CTRNN Ultra-Low Power Implementation of Human Activity Recognition
topic Digital Health
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8522023/
https://www.ncbi.nlm.nih.gov/pubmed/34713201
http://dx.doi.org/10.3389/fdgth.2021.731076