Human action recognition based on HOIRM feature fusion and AP clustering BOW
In this paper, we propose a human action recognition method using HOIRM (histogram of oriented interest region motion) feature fusion and a BOW (bag of words) model based on AP (affinity propagation) clustering. First, a HOIRM feature extraction method based on the ROI of spatiotemporal interest points is...
Main Authors: Huan, Ruo-Hong; Xie, Chao-Jie; Guo, Feng; Chi, Kai-Kai; Mao, Ke-Ji; Li, Ying-Long; Pan, Yun
Format: Online Article Text
Language: English
Published: Public Library of Science, 2019
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6658076/
https://www.ncbi.nlm.nih.gov/pubmed/31344042
http://dx.doi.org/10.1371/journal.pone.0219910
_version_ | 1783438908407676928 |
author | Huan, Ruo-Hong Xie, Chao-Jie Guo, Feng Chi, Kai-Kai Mao, Ke-Ji Li, Ying-Long Pan, Yun |
author_facet | Huan, Ruo-Hong Xie, Chao-Jie Guo, Feng Chi, Kai-Kai Mao, Ke-Ji Li, Ying-Long Pan, Yun |
author_sort | Huan, Ruo-Hong |
collection | PubMed |
description | In this paper, we propose a human action recognition method using HOIRM (histogram of oriented interest region motion) feature fusion and a BOW (bag of words) model based on AP (affinity propagation) clustering. First, a HOIRM feature extraction method based on the ROI of spatiotemporal interest points is proposed. HOIRM can be regarded as a middle-level feature between local and global features. Then, HOIRM is fused with 3D HOG and 3D HOF local features using a cumulative histogram. This fusion further improves the robustness of local features to variations in camera view angle and distance in complex scenes, which in turn improves the accuracy of action recognition. Finally, a BOW model based on AP clustering is proposed and applied to action classification. It obtains an appropriate visual dictionary capacity and achieves a better clustering effect for the joint description of a variety of features. The experimental results demonstrate that, using the fused features with the proposed BOW model, the average recognition rate is 95.75% on the KTH database and 88.25% on the UCF database, both higher than the rates obtained using only 3D HOG+3D HOF or HOIRM features. Moreover, the average recognition rate achieved by the proposed method on the two databases is higher than that obtained by other methods. |
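The BOW-with-AP-clustering step mentioned in the description above can be prototyped briefly. The following is a minimal, illustrative sketch, not the authors' implementation: it assumes scikit-learn's AffinityPropagation as the clustering step, uses synthetic arrays in place of the fused 3D HOG + 3D HOF + HOIRM descriptors, and the helper names build_vocabulary and encode_bow are invented for this example.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation


def build_vocabulary(descriptors):
    """Cluster training descriptors with affinity propagation.

    Unlike k-means, AP does not require the dictionary size to be fixed
    in advance: the exemplars it selects become the visual words, which
    is how the abstract's "appropriate visual dictionary capacity" can
    emerge from the data rather than being chosen by hand.
    """
    ap = AffinityPropagation(damping=0.9, random_state=0).fit(descriptors)
    return ap.cluster_centers_  # one row per visual word (exemplar)


def encode_bow(descriptors, vocabulary):
    """Encode one clip's descriptors as a normalized visual-word histogram."""
    # Assign each descriptor to its nearest visual word (Euclidean distance).
    dists = np.linalg.norm(descriptors[:, None, :] - vocabulary[None, :, :], axis=2)
    words = dists.argmin(axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    return hist / (hist.sum() + 1e-8)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-ins for the fused descriptors: a few well-separated blobs.
    centers = rng.normal(size=(5, 64))
    train_descriptors = np.vstack(
        [rng.normal(loc=c, scale=0.1, size=(60, 64)) for c in centers]
    )
    vocab = build_vocabulary(train_descriptors)
    clip_descriptors = rng.normal(loc=centers[0], scale=0.1, size=(80, 64))
    print(encode_bow(clip_descriptors, vocab).shape)  # -> (number of visual words,)
```

The resulting histogram would then feed a classifier (the paper reports results on the KTH and UCF databases); the sketch only illustrates how an AP-derived dictionary replaces a fixed-size k-means codebook in the BOW encoding.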
format | Online Article Text |
id | pubmed-6658076 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-6658076 2019-08-07 Human action recognition based on HOIRM feature fusion and AP clustering BOW Huan, Ruo-Hong Xie, Chao-Jie Guo, Feng Chi, Kai-Kai Mao, Ke-Ji Li, Ying-Long Pan, Yun PLoS One Research Article In this paper, we propose a human action recognition method using HOIRM (histogram of oriented interest region motion) feature fusion and a BOW (bag of words) model based on AP (affinity propagation) clustering. First, a HOIRM feature extraction method based on spatiotemporal interest points ROI is proposed. HOIRM can be regarded as a middle-level feature between local and global features. Then, HOIRM is fused with 3D HOG and 3D HOF local features using a cumulative histogram. The method further improves the robustness of local features to camera view angle and distance variations in complex scenes, which in turn improves the correct rate of action recognition. Finally, a BOW model based on AP clustering is proposed and applied to action classification. It obtains the appropriate visual dictionary capacity and achieves better clustering effect for the joint description of a variety of features. The experimental results demonstrate that by using the fused features with the proposed BOW model, the average recognition rate is 95.75% in the KTH database, and 88.25% in the UCF database, which are both higher than those by using only 3D HOG+3D HOF or HOIRM features. Moreover, the average recognition rate achieved by the proposed method in the two databases is higher than that obtained by other methods. Public Library of Science 2019-07-25 /pmc/articles/PMC6658076/ /pubmed/31344042 http://dx.doi.org/10.1371/journal.pone.0219910 Text en © 2019 Huan et al http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Huan, Ruo-Hong Xie, Chao-Jie Guo, Feng Chi, Kai-Kai Mao, Ke-Ji Li, Ying-Long Pan, Yun Human action recognition based on HOIRM feature fusion and AP clustering BOW |
title | Human action recognition based on HOIRM feature fusion and AP clustering BOW |
title_full | Human action recognition based on HOIRM feature fusion and AP clustering BOW |
title_fullStr | Human action recognition based on HOIRM feature fusion and AP clustering BOW |
title_full_unstemmed | Human action recognition based on HOIRM feature fusion and AP clustering BOW |
title_short | Human action recognition based on HOIRM feature fusion and AP clustering BOW |
title_sort | human action recognition based on hoirm feature fusion and ap clustering bow |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6658076/ https://www.ncbi.nlm.nih.gov/pubmed/31344042 http://dx.doi.org/10.1371/journal.pone.0219910 |
work_keys_str_mv | AT huanruohong humanactionrecognitionbasedonhoirmfeaturefusionandapclusteringbow AT xiechaojie humanactionrecognitionbasedonhoirmfeaturefusionandapclusteringbow AT guofeng humanactionrecognitionbasedonhoirmfeaturefusionandapclusteringbow AT chikaikai humanactionrecognitionbasedonhoirmfeaturefusionandapclusteringbow AT maokeji humanactionrecognitionbasedonhoirmfeaturefusionandapclusteringbow AT liyinglong humanactionrecognitionbasedonhoirmfeaturefusionandapclusteringbow AT panyun humanactionrecognitionbasedonhoirmfeaturefusionandapclusteringbow |