
Egocentric vision-based detection of surfaces: towards context-aware free-living digital biomarkers for gait and fall risk assessment

BACKGROUND: Falls in older adults are a critical public health problem. As a means to assess fall risks, free-living digital biomarkers (FLDBs), including spatiotemporal gait measures, drawn from wearable inertial measurement unit (IMU) data have been investigated to identify those at high risk. Alt...

Full description

Bibliographic Details
Main Authors: Nouredanesh, Mina, Godfrey, Alan, Powell, Dylan, Tung, James
Format: Online Article Text
Language: English
Published: BioMed Central 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9308210/
https://www.ncbi.nlm.nih.gov/pubmed/35869527
http://dx.doi.org/10.1186/s12984-022-01022-6
_version_ 1784752938396155904
author Nouredanesh, Mina
Godfrey, Alan
Powell, Dylan
Tung, James
author_facet Nouredanesh, Mina
Godfrey, Alan
Powell, Dylan
Tung, James
author_sort Nouredanesh, Mina
collection PubMed
description BACKGROUND: Falls in older adults are a critical public health problem. As a means to assess fall risks, free-living digital biomarkers (FLDBs), including spatiotemporal gait measures, drawn from wearable inertial measurement unit (IMU) data have been investigated to identify those at high risk. Although gait-related FLDBs can be impacted by intrinsic (e.g., gait impairment) and/or environmental (e.g., walking surfaces) factors, their respective impacts have not been differentiated by the majority of free-living fall risk assessment methods. This may lead to ambiguous interpretation of the subsequent FLDBs, and therefore, less precise intervention strategies to prevent falls. METHODS: With the aim of improving the interpretability of gait-related FLDBs and investigating the impact of environment on older adults’ gait, a vision-based framework was proposed to automatically detect the most common level walking surfaces. Using a belt-mounted camera and IMUs worn by fallers and non-fallers (mean age 73.6 yrs), a unique dataset (i.e., Multimodal Ambulatory Gait and Fall Risk Assessment in the Wild (MAGFRA-W)) was acquired. The frames and image patches attributed to nine participants’ gait were annotated: (a) outdoor terrains: pavement (asphalt, cement, outdoor bricks/tiles), gravel, grass/foliage, soil, snow/slush; and (b) indoor terrains: high-friction materials (e.g., carpet, laminated floor), wood, and tiles. A series of ConvNets was developed: EgoPlaceNet categorizes frames into indoor and outdoor; and EgoTerrainNet (with outdoor and indoor versions) detects the enclosed terrain type in patches. To improve the framework’s generalizability, an independent training dataset with 9,424 samples was curated from different databases including GTOS and MINC-2500, and used for fine-tuning pretrained models (e.g., MobileNetV2). RESULTS: EgoPlaceNet detected outdoor and indoor scenes in MAGFRA-W with 97.36% and 95.59% (leave-one-subject-out) accuracies, respectively. EgoTerrainNet-Indoor and -Outdoor achieved high detection accuracies for pavement (87.63%), foliage (91.24%), gravel (95.12%), and high-friction materials (95.02%), which indicate the models’ high generalizability. CONCLUSIONS: Encouraging results suggest that the integration of wearable cameras and deep learning approaches can provide objective contextual information in an automated manner, towards context-aware FLDBs for gait and fall risk assessment in the wild. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12984-022-01022-6.
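To illustrate the fine-tuning step mentioned in the abstract, the sketch below shows how a pretrained MobileNetV2 might be adapted to classify outdoor terrain patches in PyTorch. This is a minimal illustration, not the authors' released code: the class list, the directory layout (terrain_patches/<class_name>/<patch>.jpg), and the hyperparameters are assumptions made for this example.

import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Outdoor terrain classes named in the abstract; the exact label set used by
# EgoTerrainNet-Outdoor is an assumption here.
OUTDOOR_CLASSES = ["pavement", "gravel", "grass_foliage", "soil", "snow_slush"]

# Standard ImageNet preprocessing for MobileNetV2 inputs.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: terrain_patches/<class_name>/<patch>.jpg
train_set = datasets.ImageFolder("terrain_patches", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Start from ImageNet weights and replace the classifier head with one that
# outputs the terrain classes.
model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.IMAGENET1K_V1)
model.classifier[1] = nn.Linear(model.last_channel, len(OUTDOOR_CLASSES))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # number of fine-tuning epochs is an arbitrary choice
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

Per the abstract, the independent training set curated from GTOS and MINC-2500 was used for this kind of fine-tuning, with evaluation on MAGFRA-W patches performed in a leave-one-subject-out manner.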
format Online
Article
Text
id pubmed-9308210
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-9308210 2022-07-24 Egocentric vision-based detection of surfaces: towards context-aware free-living digital biomarkers for gait and fall risk assessment Nouredanesh, Mina Godfrey, Alan Powell, Dylan Tung, James J Neuroeng Rehabil Research BACKGROUND: Falls in older adults are a critical public health problem. As a means to assess fall risks, free-living digital biomarkers (FLDBs), including spatiotemporal gait measures, drawn from wearable inertial measurement unit (IMU) data have been investigated to identify those at high risk. Although gait-related FLDBs can be impacted by intrinsic (e.g., gait impairment) and/or environmental (e.g., walking surfaces) factors, their respective impacts have not been differentiated by the majority of free-living fall risk assessment methods. This may lead to ambiguous interpretation of the subsequent FLDBs, and therefore, less precise intervention strategies to prevent falls. METHODS: With the aim of improving the interpretability of gait-related FLDBs and investigating the impact of environment on older adults’ gait, a vision-based framework was proposed to automatically detect the most common level walking surfaces. Using a belt-mounted camera and IMUs worn by fallers and non-fallers (mean age 73.6 yrs), a unique dataset (i.e., Multimodal Ambulatory Gait and Fall Risk Assessment in the Wild (MAGFRA-W)) was acquired. The frames and image patches attributed to nine participants’ gait were annotated: (a) outdoor terrains: pavement (asphalt, cement, outdoor bricks/tiles), gravel, grass/foliage, soil, snow/slush; and (b) indoor terrains: high-friction materials (e.g., carpet, laminated floor), wood, and tiles. A series of ConvNets was developed: EgoPlaceNet categorizes frames into indoor and outdoor; and EgoTerrainNet (with outdoor and indoor versions) detects the enclosed terrain type in patches. To improve the framework’s generalizability, an independent training dataset with 9,424 samples was curated from different databases including GTOS and MINC-2500, and used for fine-tuning pretrained models (e.g., MobileNetV2). RESULTS: EgoPlaceNet detected outdoor and indoor scenes in MAGFRA-W with 97.36% and 95.59% (leave-one-subject-out) accuracies, respectively. EgoTerrainNet-Indoor and -Outdoor achieved high detection accuracies for pavement (87.63%), foliage (91.24%), gravel (95.12%), and high-friction materials (95.02%), which indicate the models’ high generalizability. CONCLUSIONS: Encouraging results suggest that the integration of wearable cameras and deep learning approaches can provide objective contextual information in an automated manner, towards context-aware FLDBs for gait and fall risk assessment in the wild. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12984-022-01022-6. BioMed Central 2022-07-22 /pmc/articles/PMC9308210/ /pubmed/35869527 http://dx.doi.org/10.1186/s12984-022-01022-6 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.
The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
spellingShingle Research
Nouredanesh, Mina
Godfrey, Alan
Powell, Dylan
Tung, James
Egocentric vision-based detection of surfaces: towards context-aware free-living digital biomarkers for gait and fall risk assessment
title Egocentric vision-based detection of surfaces: towards context-aware free-living digital biomarkers for gait and fall risk assessment
title_full Egocentric vision-based detection of surfaces: towards context-aware free-living digital biomarkers for gait and fall risk assessment
title_fullStr Egocentric vision-based detection of surfaces: towards context-aware free-living digital biomarkers for gait and fall risk assessment
title_full_unstemmed Egocentric vision-based detection of surfaces: towards context-aware free-living digital biomarkers for gait and fall risk assessment
title_short Egocentric vision-based detection of surfaces: towards context-aware free-living digital biomarkers for gait and fall risk assessment
title_sort egocentric vision-based detection of surfaces: towards context-aware free-living digital biomarkers for gait and fall risk assessment
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9308210/
https://www.ncbi.nlm.nih.gov/pubmed/35869527
http://dx.doi.org/10.1186/s12984-022-01022-6
work_keys_str_mv AT nouredaneshmina egocentricvisionbaseddetectionofsurfacestowardscontextawarefreelivingdigitalbiomarkersforgaitandfallriskassessment
AT godfreyalan egocentricvisionbaseddetectionofsurfacestowardscontextawarefreelivingdigitalbiomarkersforgaitandfallriskassessment
AT powelldylan egocentricvisionbaseddetectionofsurfacestowardscontextawarefreelivingdigitalbiomarkersforgaitandfallriskassessment
AT tungjames egocentricvisionbaseddetectionofsurfacestowardscontextawarefreelivingdigitalbiomarkersforgaitandfallriskassessment