Recurrent Transformation of Prior Knowledge Based Model for Human Motion Recognition

Bibliographic Details
Main Authors: Xu, Cheng, He, Jie, Zhang, Xiaotong, Cai, Haipiao, Duan, Shihong, Tseng, Po-Hsuan, Li, Chong
Format: Online Article Text
Language: English
Published: Hindawi 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5820668/
https://www.ncbi.nlm.nih.gov/pubmed/29568309
http://dx.doi.org/10.1155/2018/4160652
Description
Summary: Motion-related human activity recognition using wearable sensors can potentially enable various useful daily applications. So far, most studies have treated it as a stand-alone mathematical classification problem, without considering the physical nature and temporal information of human motions. Consequently, they suffer from data dependencies, encounter the curse of dimensionality and overfitting, and produce models that are hard to interpret intuitively. Given a specific motion set, structured domain knowledge, where it can be obtained manually, can be used to better recognize certain motions. In this study, we start from a deep analysis of the natural physical properties and possible temporal recurrent transformations of human motions, and then propose a Recurrent Transformation Prior Knowledge-based Decision Tree (RT-PKDT) model for recognizing specific human motions. RT-PKDT uses temporal information and a hierarchical classification method, making the most of streaming sensor data and human knowledge to compensate for possible data inadequacy. The experimental results indicate that the proposed method outperforms those adopted in related works, such as SVM, BP neural networks, and Bayesian networks, achieving an accuracy of 96.68%.
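
The abstract describes a hierarchical decision tree that encodes physical priors about motions and uses the previous motion as a temporal constraint. The sketch below is only a rough illustration of that general idea, not the authors' actual RT-PKDT model: the feature names (accel_energy, vertical_velocity, gyro_energy), all thresholds, and the motion labels are hypothetical placeholders.

```python
# Illustrative sketch of a hand-built hierarchical decision tree with
# physical and temporal priors. NOT the authors' RT-PKDT implementation;
# every feature, threshold, and label here is a hypothetical placeholder.

from dataclasses import dataclass


@dataclass
class MotionFeatures:
    """Hypothetical per-window features from a wearable inertial sensor."""
    accel_energy: float       # mean acceleration magnitude over the window (g)
    vertical_velocity: float  # integrated vertical acceleration (m/s)
    gyro_energy: float        # mean angular-rate magnitude (rad/s)


def classify(features: MotionFeatures, previous_motion: str) -> str:
    """Hierarchical rule-based classification with a temporal prior.

    Level 1 splits static vs. dynamic motions (physical prior); later
    levels refine dynamic motions. The previous motion constrains which
    transitions are physically plausible (recurrent/temporal prior).
    """
    # Level 1: static vs. dynamic, based on overall movement energy.
    if features.accel_energy < 1.2:  # hypothetical threshold
        # Temporal prior: a static window following "lying" or "sitting"
        # most plausibly continues that posture rather than jumping to
        # "standing" without an intermediate transition.
        if previous_motion in ("lying", "sitting"):
            return previous_motion
        return "standing"

    # Level 2: dynamic motions, split by vertical movement direction.
    if features.vertical_velocity > 0.5:
        return "going_upstairs"
    if features.vertical_velocity < -0.5:
        return "going_downstairs"

    # Level 3: remaining level-ground motions, split by intensity.
    return "running" if features.gyro_energy > 3.0 else "walking"


if __name__ == "__main__":
    window = MotionFeatures(accel_energy=2.4, vertical_velocity=0.1, gyro_energy=1.8)
    print(classify(window, previous_motion="standing"))  # -> walking
```

Because each split corresponds to a physical property or a plausible motion transition, such a tree remains interpretable and needs far less training data than a flat high-dimensional classifier, which is the trade-off the abstract emphasizes.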