
Interpretable Passive Multi-Modal Sensor Fusion for Human Identification and Activity Recognition



Bibliographic Details
Main Authors: Yuan, Liangqi; Andrews, Jack; Mu, Huaizheng; Vakil, Asad; Ewing, Robert; Blasch, Erik; Li, Jia
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9371208/
https://www.ncbi.nlm.nih.gov/pubmed/35957343
http://dx.doi.org/10.3390/s22155787
collection PubMed
description Human monitoring applications in indoor environments depend on accurate human identification and activity recognition (HIAR). Single-modality sensor systems have been shown to be accurate for HIAR, but they carry shortcomings such as privacy concerns, intrusiveness, and cost. To overcome these shortcomings for a long-term monitoring solution, an interpretable, passive, multi-modal sensor fusion system, PRF-PIR, is proposed in this work. PRF-PIR is composed of one software-defined radio (SDR) device and one novel passive infrared (PIR) sensor system. A recurrent neural network (RNN) is built as the HIAR model for this proposed solution to handle the temporal dependence of the passive information captured by both modalities. We validate the proposed PRF-PIR system for potential human monitoring through the collection of data on eleven activities from twelve human subjects in an academic office environment. From this data collection, the efficacy of the sensor fusion system is demonstrated by an accuracy of 0.9866 for human identification and an accuracy of 0.9623 for activity recognition. The results are supported with explainable artificial intelligence (XAI) methodologies, which validate sensor fusion over the deployment of single-sensor solutions. PRF-PIR provides a passive, non-intrusive, and highly accurate system that remains robust across uncertain, highly similar, and complex at-home activities performed by a variety of human subjects.
id pubmed-9371208
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
Journal: Sensors (Basel)
Published Online: 2022-08-03
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).