
Learning Gait Representations with Noisy Multi-Task Learning

Gait analysis is proven to be a reliable way to perform person identification without relying on subject cooperation. Walking is a biometric that does not significantly change in short periods of time and can be regarded as unique to each person. So far, the study of gait analysis has focused mostly on identification and demographics estimation, without considering many of the pedestrian attributes that appearance-based methods rely on. In this work, alongside gait-based person identification, we explore pedestrian attribute identification solely from movement patterns. We propose DenseGait, the largest dataset for pretraining gait analysis systems, containing 217K anonymized tracklets annotated automatically with 42 appearance attributes. DenseGait is constructed by automatically processing video streams and offers the full array of gait covariates present in the real world. We make the dataset available to the research community. Additionally, we propose GaitFormer, a transformer-based model that, after pretraining in a multi-task fashion on DenseGait, achieves 92.5% accuracy on CASIA-B and 85.33% on FVG without utilizing any manually annotated data. This corresponds to a +14.2% and +9.67% accuracy increase compared to similar methods. Moreover, GaitFormer is able to accurately identify gender information and a multitude of appearance attributes utilizing only movement patterns. The code to reproduce the experiments is made publicly available.

Bibliographic Details
Main Authors: Cosma, Adrian; Radoi, Emilian
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9506362/
https://www.ncbi.nlm.nih.gov/pubmed/36146152
http://dx.doi.org/10.3390/s22186803
author Cosma, Adrian
Radoi, Emilian
collection PubMed
description Gait analysis is proven to be a reliable way to perform person identification without relying on subject cooperation. Walking is a biometric that does not significantly change in short periods of time and can be regarded as unique to each person. So far, the study of gait analysis has focused mostly on identification and demographics estimation, without considering many of the pedestrian attributes that appearance-based methods rely on. In this work, alongside gait-based person identification, we explore pedestrian attribute identification solely from movement patterns. We propose DenseGait, the largest dataset for pretraining gait analysis systems, containing 217K anonymized tracklets annotated automatically with 42 appearance attributes. DenseGait is constructed by automatically processing video streams and offers the full array of gait covariates present in the real world. We make the dataset available to the research community. Additionally, we propose GaitFormer, a transformer-based model that, after pretraining in a multi-task fashion on DenseGait, achieves 92.5% accuracy on CASIA-B and 85.33% on FVG without utilizing any manually annotated data. This corresponds to a +14.2% and +9.67% accuracy increase compared to similar methods. Moreover, GaitFormer is able to accurately identify gender information and a multitude of appearance attributes utilizing only movement patterns. The code to reproduce the experiments is made publicly available.
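The multi-task pretraining objective described in the abstract — jointly predicting identity and multiple appearance attributes from a single gait embedding — can be illustrated with a minimal sketch. All names, sizes, and the loss weighting below are hypothetical stand-ins; the paper's actual architecture and loss are not reproduced here, and a random vector stands in for the transformer encoder's output.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_IDS, NUM_ATTRS, DIM = 100, 42, 64  # toy sizes; DenseGait annotates 42 attributes

def softmax_cross_entropy(logits, target_idx):
    """Cross-entropy for the person-identification head (single-label)."""
    z = logits - logits.max()                     # numerical stability
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[target_idx]

def binary_cross_entropy(logits, targets):
    """Mean BCE for the multi-label appearance-attribute head."""
    p = 1.0 / (1.0 + np.exp(-logits))
    eps = 1e-9
    return -np.mean(targets * np.log(p + eps) + (1 - targets) * np.log(1 - p + eps))

# A real gait embedding would come from a transformer encoder over a
# skeleton-sequence tracklet; a random vector stands in for that output.
embedding = rng.normal(size=DIM)
W_id = rng.normal(size=(NUM_IDS, DIM)) * 0.1      # identification head (hypothetical)
W_attr = rng.normal(size=(NUM_ATTRS, DIM)) * 0.1  # attribute head (hypothetical)

id_loss = softmax_cross_entropy(W_id @ embedding, target_idx=3)
attr_loss = binary_cross_entropy(W_attr @ embedding,
                                 rng.integers(0, 2, NUM_ATTRS).astype(float))

# Multi-task objective: weighted sum of the two task losses.
lam = 1.0  # task-balancing weight (assumed, not from the paper)
total_loss = id_loss + lam * attr_loss
print(float(total_loss))
```

Because the attribute labels in DenseGait are produced automatically, they are noisy; in practice such a setup typically pairs this combined loss with noise-tolerant weighting or label smoothing.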
format Online
Article
Text
id pubmed-9506362
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9506362 2022-09-24 Learning Gait Representations with Noisy Multi-Task Learning Cosma, Adrian; Radoi, Emilian. Sensors (Basel), Article. MDPI 2022-09-08 /pmc/articles/PMC9506362/ /pubmed/36146152 http://dx.doi.org/10.3390/s22186803 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Learning Gait Representations with Noisy Multi-Task Learning
topic Article