The use of deep learning for smartphone-based human activity recognition
The emerging field of digital phenotyping leverages the numerous sensors embedded in a smartphone to better understand its user's current psychological state and behavior, enabling improved health support systems for patients. As part of this work, a common task is to use the smartphone acceler...
Main Authors: | Stampfler, Tristan; Elgendi, Mohamed; Fletcher, Richard Ribon; Menon, Carlo |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2023 |
Subjects: | Public Health |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10011495/ https://www.ncbi.nlm.nih.gov/pubmed/36926170 http://dx.doi.org/10.3389/fpubh.2023.1086671 |
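The record's abstract highlights a Resnet-style network regularized with dropout and label smoothing for classifying fixed-length accelerometer windows. As a hedged illustration only, the following minimal PyTorch sketch shows how such a 1D residual classifier could be assembled; the window length (151 samples), channel width, dropout rate, class count (17), and every layer choice here are assumptions made for illustration and do not reproduce the authors' actual architecture or hyper-parameters.

```python
import torch
import torch.nn as nn


class ResBlock1d(nn.Module):
    """A basic 1D residual block: two convolutions plus a skip connection."""

    def __init__(self, channels: int, dropout: float = 0.2):
        super().__init__()
        self.conv1 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm1d(channels)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm1d(channels)
        self.drop = nn.Dropout(dropout)   # dropout regularization, as named in the abstract
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.drop(self.bn2(self.conv2(out)))
        return self.relu(out + x)         # residual (skip) connection


class HarResnet(nn.Module):
    """Small Resnet-style classifier for tri-axial accelerometer windows (hypothetical)."""

    def __init__(self, n_classes: int = 17, channels: int = 64, n_blocks: int = 3):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv1d(3, channels, kernel_size=7, padding=3),  # 3 input axes (x, y, z)
            nn.BatchNorm1d(channels),
            nn.ReLU(),
        )
        self.blocks = nn.Sequential(*[ResBlock1d(channels) for _ in range(n_blocks)])
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(channels, n_classes)
        )

    def forward(self, x):                 # x: (batch, 3, window_length)
        return self.head(self.blocks(self.stem(x)))


model = HarResnet()
# Label smoothing, another of the regularizers mentioned in the abstract.
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)
logits = model(torch.randn(8, 3, 151))    # a batch of 8 windows, 151 samples each (assumed length)
loss = criterion(logits, torch.randint(0, 17, (8,)))
```

A 1D residual stack of this kind maps naturally onto fixed-length accelerometer segments, which is why such architectures are a common starting point for smartphone HAR.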
_version_ | 1784906406962397184 |
---|---|
author | Stampfler, Tristan Elgendi, Mohamed Fletcher, Richard Ribon Menon, Carlo |
author_facet | Stampfler, Tristan Elgendi, Mohamed Fletcher, Richard Ribon Menon, Carlo |
author_sort | Stampfler, Tristan |
collection | PubMed |
description | The emerging field of digital phenotyping leverages the numerous sensors embedded in a smartphone to better understand its user's current psychological state and behavior, enabling improved health support systems for patients. As part of this work, a common task is to use the smartphone accelerometer to automatically recognize or classify the behavior of the user, known as human activity recognition (HAR). In this article, we present a deep learning method using the Resnet architecture to implement HAR using the popular UniMiB-SHAR public dataset, containing 11,771 measurement segments from 30 users ranging in age between 18 and 60 years. We present a unified deep learning approach based on a Resnet architecture that consistently exceeds the state-of-the-art accuracy and F1-score across all classification tasks and evaluation methods mentioned in the literature. The most notable increase we disclose regards the leave-one-subject-out evaluation, known as the most rigorous evaluation method, where we push the state-of-the-art accuracy from 78.24 to 80.09% and the F1-score from 78.40 to 79.36%. For such results, we resorted to deep learning techniques, such as hyper-parameter tuning, label smoothing, and dropout, which helped regularize the Resnet training and reduced overfitting. We discuss how our approach could easily be adapted to perform HAR in real-time and discuss future research directions. |
format | Online Article Text |
id | pubmed-10011495 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-100114952023-03-15 The use of deep learning for smartphone-based human activity recognition Stampfler, Tristan Elgendi, Mohamed Fletcher, Richard Ribon Menon, Carlo Front Public Health Public Health The emerging field of digital phenotyping leverages the numerous sensors embedded in a smartphone to better understand its user's current psychological state and behavior, enabling improved health support systems for patients. As part of this work, a common task is to use the smartphone accelerometer to automatically recognize or classify the behavior of the user, known as human activity recognition (HAR). In this article, we present a deep learning method using the Resnet architecture to implement HAR using the popular UniMiB-SHAR public dataset, containing 11,771 measurement segments from 30 users ranging in age between 18 and 60 years. We present a unified deep learning approach based on a Resnet architecture that consistently exceeds the state-of-the-art accuracy and F1-score across all classification tasks and evaluation methods mentioned in the literature. The most notable increase we disclose regards the leave-one-subject-out evaluation, known as the most rigorous evaluation method, where we push the state-of-the-art accuracy from 78.24 to 80.09% and the F1-score from 78.40 to 79.36%. For such results, we resorted to deep learning techniques, such as hyper-parameter tuning, label smoothing, and dropout, which helped regularize the Resnet training and reduced overfitting. We discuss how our approach could easily be adapted to perform HAR in real-time and discuss future research directions. Frontiers Media S.A. 2023-02-28 /pmc/articles/PMC10011495/ /pubmed/36926170 http://dx.doi.org/10.3389/fpubh.2023.1086671 Text en Copyright © 2023 Stampfler, Elgendi, Fletcher and Menon. https://creativecommons.org/licenses/by/4.0/This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Public Health Stampfler, Tristan Elgendi, Mohamed Fletcher, Richard Ribon Menon, Carlo The use of deep learning for smartphone-based human activity recognition |
title | The use of deep learning for smartphone-based human activity recognition |
title_full | The use of deep learning for smartphone-based human activity recognition |
title_fullStr | The use of deep learning for smartphone-based human activity recognition |
title_full_unstemmed | The use of deep learning for smartphone-based human activity recognition |
title_short | The use of deep learning for smartphone-based human activity recognition |
title_sort | use of deep learning for smartphone-based human activity recognition |
topic | Public Health |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10011495/ https://www.ncbi.nlm.nih.gov/pubmed/36926170 http://dx.doi.org/10.3389/fpubh.2023.1086671 |
work_keys_str_mv | AT stampflertristan theuseofdeeplearningforsmartphonebasedhumanactivityrecognition AT elgendimohamed theuseofdeeplearningforsmartphonebasedhumanactivityrecognition AT fletcherrichardribon theuseofdeeplearningforsmartphonebasedhumanactivityrecognition AT menoncarlo theuseofdeeplearningforsmartphonebasedhumanactivityrecognition AT stampflertristan useofdeeplearningforsmartphonebasedhumanactivityrecognition AT elgendimohamed useofdeeplearningforsmartphonebasedhumanactivityrecognition AT fletcherrichardribon useofdeeplearningforsmartphonebasedhumanactivityrecognition AT menoncarlo useofdeeplearningforsmartphonebasedhumanactivityrecognition |
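The abstract singles out leave-one-subject-out (LOSO) evaluation as the most rigorous protocol, because every test subject is entirely unseen during training. Below is a minimal sketch of such a split, assuming scikit-learn's LeaveOneGroupOut and hypothetical array names (X, y, subjects); the DummyClassifier and random data merely stand in for the trained Resnet and the real dataset, and this is not the authors' exact evaluation code.

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import LeaveOneGroupOut

# Hypothetical placeholders: X holds flattened accelerometer windows, y the
# activity labels, and subjects records which of the 30 users produced each window.
rng = np.random.default_rng(0)
X = rng.normal(size=(11771, 3 * 151)).astype(np.float32)
y = rng.integers(0, 17, size=11771)
subjects = rng.integers(0, 30, size=11771)

logo = LeaveOneGroupOut()
scores = []
for train_idx, test_idx in logo.split(X, y, groups=subjects):
    # Each fold trains on 29 subjects and tests on the single held-out subject.
    clf = DummyClassifier(strategy="most_frequent")  # stand-in for the real Resnet
    clf.fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))

# Accuracy and F1-score are then averaged over the held-out subjects.
print(f"LOSO mean accuracy: {np.mean(scores):.3f}")
```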