
Algorithm based on one monocular video delivers highly valid and reliable gait parameters


Bibliographic Details
Main Authors: Azhand, Arash, Rabe, Sophie, Müller, Swantje, Sattler, Igor, Heimann-Steinert, Anika
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8263606/
https://www.ncbi.nlm.nih.gov/pubmed/34234255
http://dx.doi.org/10.1038/s41598-021-93530-z
author Azhand, Arash
Rabe, Sophie
Müller, Swantje
Sattler, Igor
Heimann-Steinert, Anika
collection PubMed
description Despite its paramount importance for manifold use cases (e.g., in the health care industry, sports, rehabilitation and fitness assessment), sufficiently valid and reliable gait parameter measurement is still mostly limited to high-tech gait laboratories. Here, we demonstrate the excellent validity and test–retest repeatability of a novel gait assessment system built upon modern convolutional neural networks that extract three-dimensional skeleton joints from monocular frontal-view videos of walking humans. The validity study is based on a comparison to the GAITRite pressure-sensitive walkway system. All measured gait parameters (gait speed, cadence, step length and step time) showed excellent concurrent validity across multiple walk trials at normal and fast gait speeds, and the test–retest repeatability is on the same level as that of the GAITRite system. In conclusion, we are convinced that our results can pave the way for cost-, space- and operationally effective gait analysis in broad mainstream applications. Most sensor-based systems are costly, must be operated by extensively trained personnel (e.g., motion capture systems) or, even if not quite as costly, still possess considerable complexity (e.g., wearable sensors). In contrast, a video sufficient for the assessment method presented here can be obtained by anyone, without much training, using a smartphone camera.
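
To make the reported parameters concrete, the minimal sketch below shows one way spatiotemporal gait parameters (gait speed, cadence, step length and step time) could be derived once per-frame 3D heel positions are available from a pose estimator. This is an illustrative assumption, not the authors' published algorithm: the function `gait_parameters`, the heel-trajectory inputs and the heel-strike heuristic (local minima of vertical heel height) are hypothetical.

```python
# Illustrative sketch only; NOT the algorithm from the article. Assumes per-frame
# 3D positions (in metres) of the left and right heel joints, e.g. from a
# monocular 3D pose estimator, sampled at a known video frame rate.
import numpy as np
from scipy.signal import find_peaks

def gait_parameters(left_heel, right_heel, fps):
    """Estimate gait speed, cadence, step length and step time.

    left_heel, right_heel : (T, 3) arrays of heel positions per frame,
        columns = (lateral x, vertical y, walking-direction z).
    fps : video frame rate in Hz.
    """
    events = []  # (time [s], forward position [m]) of each detected heel strike
    for heel in (left_heel, right_heel):
        # Heel strikes approximated as local minima of the vertical heel
        # coordinate, at least ~0.4 s apart to suppress spurious detections.
        idx, _ = find_peaks(-heel[:, 1], distance=int(0.4 * fps))
        events += [(i / fps, heel[i, 2]) for i in idx]
    events.sort()  # interleave left and right heel strikes in time

    times = np.array([t for t, _ in events])
    forward = np.array([z for _, z in events])

    step_times = np.diff(times)              # seconds per step
    step_lengths = np.abs(np.diff(forward))  # metres per step
    duration = times[-1] - times[0]
    return {
        "gait_speed": abs(forward[-1] - forward[0]) / duration,  # m/s
        "cadence": 60.0 * len(step_times) / duration,            # steps/min
        "step_length": step_lengths.mean(),                      # m
        "step_time": step_times.mean(),                          # s
    }
```

In practice the event-detection step is the sensitive part; a system like the one described would need a more robust heel-strike model than this simple vertical-minimum heuristic, particularly for frontal-view video where depth estimates are noisy.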
format Online
Article
Text
id pubmed-8263606
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-8263606 2021-07-09 Algorithm based on one monocular video delivers highly valid and reliable gait parameters. Sci Rep, Nature Publishing Group UK, published 2021-07-07. /pmc/articles/PMC8263606/ /pubmed/34234255 http://dx.doi.org/10.1038/s41598-021-93530-z Text en © The Author(s) 2021. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, provided appropriate credit is given to the original author(s) and the source (https://creativecommons.org/licenses/by/4.0/).
title Algorithm based on one monocular video delivers highly valid and reliable gait parameters
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8263606/
https://www.ncbi.nlm.nih.gov/pubmed/34234255
http://dx.doi.org/10.1038/s41598-021-93530-z