A Comparative Study of Feature Selection Approaches for Human Activity Recognition Using Multimodal Sensory Data
Main Authors: | Amjad, Fatima; Khan, Muhammad Hassan; Nisar, Muhammad Adeel; Farid, Muhammad Shahid; Grzegorzek, Marcin |
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2021 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8036571/ https://www.ncbi.nlm.nih.gov/pubmed/33805368 http://dx.doi.org/10.3390/s21072368 |
_version_ | 1783676941560184832 |
author | Amjad, Fatima Khan, Muhammad Hassan Nisar, Muhammad Adeel Farid, Muhammad Shahid Grzegorzek, Marcin |
author_facet | Amjad, Fatima Khan, Muhammad Hassan Nisar, Muhammad Adeel Farid, Muhammad Shahid Grzegorzek, Marcin |
author_sort | Amjad, Fatima |
collection | PubMed |
description | Human activity recognition (HAR) aims to recognize the actions of the human body through a series of observations and environmental conditions. The analysis of human activities has drawn the attention of the research community over the last two decades due to its widespread applications, the diverse nature of activities, and the available recording infrastructure. Lately, one of the most challenging applications in this framework is to recognize human body actions using unobtrusive wearable motion sensors. Since the human activities of daily life (e.g., cooking, eating) comprise several repetitive and circumstantial short sequences of actions (e.g., moving an arm), it is difficult to use the sensory data directly for recognition because multiple sequences of the same activity may differ widely. However, a similarity can be observed in the temporal occurrence of the atomic actions. Therefore, this paper presents a two-level hierarchical method to recognize human activities using a set of wearable sensors. In the first step, the atomic activities are detected from the original sensory data and their recognition scores are obtained. In the second step, the composite activities are recognized from the scores of the atomic actions. We propose two different methods of feature extraction from atomic scores to recognize the composite activities: handcrafted features and features obtained using a subspace pooling technique. The proposed method is evaluated on the large, publicly available CogAge dataset, which contains instances of both atomic and composite activities. The data are recorded using three unobtrusive wearable devices: a smartphone, a smartwatch, and smart glasses. We also evaluated the performance of different classification algorithms in recognizing the composite activities.
The proposed method achieved average recognition accuracies of 79% and 62.8% using the handcrafted features and the subspace pooling features, respectively. The recognition results and their comparison with existing state-of-the-art techniques confirm the effectiveness of the proposed method. |
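The second stage of the pipeline described above can be sketched in a few lines: each composite-activity instance yields a sequence of atomic-activity score vectors (a T × A matrix), which is pooled into a fixed-length feature for a standard classifier. The two pooling functions below are illustrative assumptions only; the record does not specify the paper's exact handcrafted statistics or subspace-pooling formulation.

```python
import numpy as np

def subspace_pooling(score_seq, k=2):
    """Pool a (T x A) sequence of atomic-activity scores into a fixed-size
    feature by keeping the top-k right singular vectors of the centered
    sequence (one common way to realize subspace pooling)."""
    centered = score_seq - score_seq.mean(axis=0, keepdims=True)
    # Rows span time windows, columns span atomic activities.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    # Flatten the top-k principal directions of the atomic-score space.
    return vt[:k].ravel()

def handcrafted_features(score_seq):
    """Simple per-activity statistics over time (mean, std, max) as an
    example of handcrafted features computed from atomic scores."""
    return np.concatenate([score_seq.mean(axis=0),
                           score_seq.std(axis=0),
                           score_seq.max(axis=0)])

# Example: 50 time windows, 12 atomic activities.
rng = np.random.default_rng(0)
scores = rng.random((50, 12))
print(subspace_pooling(scores, k=2).shape)   # (24,)
print(handcrafted_features(scores).shape)    # (36,)
```

Either feature vector has a length independent of the sequence duration T, which is what lets variable-length composite activities feed a fixed-input classifier.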
format | Online Article Text |
id | pubmed-8036571 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-8036571 2021-04-12 A Comparative Study of Feature Selection Approaches for Human Activity Recognition Using Multimodal Sensory Data Amjad, Fatima; Khan, Muhammad Hassan; Nisar, Muhammad Adeel; Farid, Muhammad Shahid; Grzegorzek, Marcin. Sensors (Basel), Article. MDPI 2021-03-29 /pmc/articles/PMC8036571/ /pubmed/33805368 http://dx.doi.org/10.3390/s21072368 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Amjad, Fatima Khan, Muhammad Hassan Nisar, Muhammad Adeel Farid, Muhammad Shahid Grzegorzek, Marcin A Comparative Study of Feature Selection Approaches for Human Activity Recognition Using Multimodal Sensory Data |
title | A Comparative Study of Feature Selection Approaches for Human Activity Recognition Using Multimodal Sensory Data |
title_full | A Comparative Study of Feature Selection Approaches for Human Activity Recognition Using Multimodal Sensory Data |
title_fullStr | A Comparative Study of Feature Selection Approaches for Human Activity Recognition Using Multimodal Sensory Data |
title_full_unstemmed | A Comparative Study of Feature Selection Approaches for Human Activity Recognition Using Multimodal Sensory Data |
title_short | A Comparative Study of Feature Selection Approaches for Human Activity Recognition Using Multimodal Sensory Data |
title_sort | comparative study of feature selection approaches for human activity recognition using multimodal sensory data |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8036571/ https://www.ncbi.nlm.nih.gov/pubmed/33805368 http://dx.doi.org/10.3390/s21072368 |
work_keys_str_mv | AT amjadfatima acomparativestudyoffeatureselectionapproachesforhumanactivityrecognitionusingmultimodalsensorydata AT khanmuhammadhassan acomparativestudyoffeatureselectionapproachesforhumanactivityrecognitionusingmultimodalsensorydata AT nisarmuhammadadeel acomparativestudyoffeatureselectionapproachesforhumanactivityrecognitionusingmultimodalsensorydata AT faridmuhammadshahid acomparativestudyoffeatureselectionapproachesforhumanactivityrecognitionusingmultimodalsensorydata AT grzegorzekmarcin acomparativestudyoffeatureselectionapproachesforhumanactivityrecognitionusingmultimodalsensorydata AT amjadfatima comparativestudyoffeatureselectionapproachesforhumanactivityrecognitionusingmultimodalsensorydata AT khanmuhammadhassan comparativestudyoffeatureselectionapproachesforhumanactivityrecognitionusingmultimodalsensorydata AT nisarmuhammadadeel comparativestudyoffeatureselectionapproachesforhumanactivityrecognitionusingmultimodalsensorydata AT faridmuhammadshahid comparativestudyoffeatureselectionapproachesforhumanactivityrecognitionusingmultimodalsensorydata AT grzegorzekmarcin comparativestudyoffeatureselectionapproachesforhumanactivityrecognitionusingmultimodalsensorydata |