
Multi-Perspective Representation to Part-Based Graph for Group Activity Recognition

Group activity recognition, which infers the activity of a group of people, is a challenging task that has received a great deal of interest in recent years. Unlike individual action recognition, group activity recognition must model not only the visual cues of individuals but also the relationships between them. Existing approaches infer relations from the holistic features of each individual. However, parts of the human body, such as the head, hands, and legs, and the relationships between them, are the critical cues in most group activities. In this paper, we establish part-based graphs from different viewpoints. An intra-actor part graph models the spatial relations among the parts of an individual, and an inter-actor part graph explores part-level relations among actors, considering both visual and location relations. Furthermore, a two-branch framework captures static spatial and dynamic temporal representations simultaneously. On the Volleyball Dataset, our approach obtains a classification accuracy of 94.8%, which is very competitive with the state of the art. On the Collective Activity Dataset, our approach improves accuracy by 0.3% over the state-of-the-art results.
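
The abstract describes part-level relational reasoning only at a high level. As a rough, hypothetical sketch of that idea (not the paper's actual implementation), the PyTorch snippet below builds a part-level graph whose edge weights fuse visual similarity with a distance-based location prior and performs one graph-convolution-style update. All class names, tensor shapes, and the Gaussian location kernel are assumptions made for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F


class PartGraphLayer(nn.Module):
    """One relational-reasoning step over body-part nodes (illustrative sketch).

    Nodes are body parts of all actors in a frame; edge weights fuse
    appearance similarity with a location-based prior.
    """

    def __init__(self, feat_dim: int, embed_dim: int = 128, sigma: float = 0.3):
        super().__init__()
        self.query = nn.Linear(feat_dim, embed_dim)  # visual-relation embedding
        self.key = nn.Linear(feat_dim, embed_dim)
        self.update = nn.Linear(feat_dim, feat_dim)  # node-feature update
        self.sigma = sigma                           # bandwidth of the location prior (assumed)

    def forward(self, feats: torch.Tensor, coords: torch.Tensor) -> torch.Tensor:
        # feats:  (N, feat_dim) appearance features of N part nodes
        # coords: (N, 2) normalized (x, y) centers of those parts
        q, k = self.query(feats), self.key(feats)
        visual = q @ k.t() / q.shape[-1] ** 0.5                      # (N, N) visual relation
        dist = torch.cdist(coords, coords)                           # (N, N) pairwise distances
        location = torch.exp(-dist.pow(2) / (2 * self.sigma ** 2))   # (N, N) location relation
        adj = F.softmax(visual * location, dim=-1)                   # fused, row-normalized graph
        return feats + F.relu(self.update(adj @ feats))              # one message-passing step


if __name__ == "__main__":
    # Toy usage: 12 actors x 4 parts = 48 part nodes with 256-d features.
    parts = torch.randn(48, 256)
    centers = torch.rand(48, 2)
    layer = PartGraphLayer(feat_dim=256)
    print(layer(parts, centers).shape)  # torch.Size([48, 256])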


Bibliographic Details
Main Authors: Wu, Lifang; Lang, Xianglong; Xiang, Ye; Wang, Qi; Tian, Meng
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9371107/
https://www.ncbi.nlm.nih.gov/pubmed/35898025
http://dx.doi.org/10.3390/s22155521
collection PubMed
id pubmed-9371107
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling Sensors (Basel), MDPI, published 2022-07-24. © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
topic Article