
A Bayesian Dynamical Approach for Human Action Recognition

We introduce a generative Bayesian switching dynamical model for action recognition in 3D skeletal data. Our model encodes highly correlated skeletal data into a few sets of low-dimensional switching temporal processes and from there decodes to the motion data and their associated action labels. We parameterize these temporal processes with respect to a switching deep autoregressive prior to accommodate both multimodal and higher-order nonlinear inter-dependencies. This results in a dynamical deep generative latent model that parses meaningful intrinsic states in skeletal dynamics and enables action recognition. These sequences of states provide visual and quantitative interpretations of the motion primitives that gave rise to each action class, which have not been explored previously. In contrast to previous works, which often overlook temporal dynamics, our method explicitly models temporal transitions and is generative. Our experiments on two large-scale 3D skeletal datasets substantiate the superior performance of our model in comparison with state-of-the-art methods. Specifically, our method achieved 6.3% higher action classification accuracy (by incorporating a dynamical generative framework) and 3.5% lower predictive error (by employing a nonlinear second-order dynamical transition model) than the best-performing competitors.
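The core mechanism described above, a switching, second-order nonlinear autoregressive prior over low-dimensional latent processes, can be illustrated with a minimal sketch. This is not the authors' implementation: the latent dimension, number of switching states, network sizes, class and variable names, and the use of PyTorch are all assumptions made for illustration.

```python
# Minimal sketch (assumed, not the authors' code): a switching second-order
# nonlinear autoregressive transition p(z_t | z_{t-1}, z_{t-2}, s_t), where
# s_t is a discrete switching state selecting one small transition network.
import torch
import torch.nn as nn

class SwitchingAR2Prior(nn.Module):
    def __init__(self, latent_dim=8, num_states=4, hidden=32):
        super().__init__()
        # One nonlinear transition network per switching state.
        self.transitions = nn.ModuleList([
            nn.Sequential(
                nn.Linear(2 * latent_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, 2 * latent_dim),  # predicts mean and log-variance
            )
            for _ in range(num_states)
        ])

    def forward(self, z_prev, z_prev2, state):
        # z_prev, z_prev2: (batch, latent_dim); state: (batch,) integer labels
        inp = torch.cat([z_prev, z_prev2], dim=-1)
        out = torch.stack([net(inp) for net in self.transitions], dim=1)  # (B, S, 2D)
        out = out[torch.arange(len(state)), state]  # pick the per-sample state's output
        mean, logvar = out.chunk(2, dim=-1)
        return mean, logvar

# Usage: sample the next latent given the two previous latents and a state.
prior = SwitchingAR2Prior()
z1, z2 = torch.randn(5, 8), torch.randn(5, 8)
s = torch.randint(0, 4, (5,))
mean, logvar = prior(z1, z2, s)
z_next = mean + torch.randn_like(mean) * (0.5 * logvar).exp()
```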

Bibliographic Details
Main Authors: Farnoosh, Amirreza; Wang, Zhouping; Zhu, Shaotong; Ostadabbas, Sarah
Format: Online Article (Text)
Language: English
Published: MDPI, 2021
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8402468/
https://www.ncbi.nlm.nih.gov/pubmed/34451054
http://dx.doi.org/10.3390/s21165613
Collection: PubMed
Record ID: pubmed-8402468
Institution: National Center for Biotechnology Information
Record format: MEDLINE/PubMed
Published online: 2021-08-20
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).