Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video
Recent advancements in computing and artificial intelligence (AI) make it possible to quantitatively evaluate human movement using digital video, thereby opening the possibility of more accessible gait analysis. The Edinburgh Visual Gait Score (EVGS) is an effective tool for observational gait analysis, but human scoring of videos can take over 20 min and requires experienced observers.
Main Authors: | Ramesh, Shri Harini; Lemaire, Edward D.; Tu, Albert; Cheung, Kevin; Baddour, Natalie |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10220686/ https://www.ncbi.nlm.nih.gov/pubmed/37430751 http://dx.doi.org/10.3390/s23104839 |
_version_ | 1785049277091807232 |
---|---|
author | Ramesh, Shri Harini Lemaire, Edward D. Tu, Albert Cheung, Kevin Baddour, Natalie |
author_facet | Ramesh, Shri Harini Lemaire, Edward D. Tu, Albert Cheung, Kevin Baddour, Natalie |
author_sort | Ramesh, Shri Harini |
collection | PubMed |
description | Recent advancements in computing and artificial intelligence (AI) make it possible to quantitatively evaluate human movement using digital video, thereby opening the possibility of more accessible gait analysis. The Edinburgh Visual Gait Score (EVGS) is an effective tool for observational gait analysis, but human scoring of videos can take over 20 min and requires experienced observers. This research developed an algorithmic implementation of the EVGS from handheld smartphone video to enable automatic scoring. Participant walking was video recorded at 60 Hz using a smartphone, and body keypoints were identified using the OpenPose BODY25 pose estimation model. An algorithm was developed to identify foot events and strides, and EVGS parameters were determined at relevant gait events. Stride detection was accurate within two to five frames. The level of agreement between the algorithmic and human reviewer EVGS results was strong for 14 of 17 parameters, and the algorithmic EVGS results were highly correlated (r > 0.80, “r” represents the Pearson correlation coefficient) to the ground truth values for 8 of the 17 parameters. This approach could make gait analysis more accessible and cost-effective, particularly in areas without gait assessment expertise. These findings pave the way for future studies to explore the use of smartphone video and AI algorithms in remote gait analysis. |
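The description above summarizes the processing pipeline: OpenPose BODY25 keypoints extracted from 60 Hz smartphone video, foot-event and stride detection, and EVGS parameter scoring at the relevant gait events. The record does not include the authors' code, so the sketch below only illustrates one plausible form of the foot-event step: it reads OpenPose per-frame JSON output and marks heel-strike frames as peaks of heel position anterior to the pelvis, a common coordinate-based heuristic rather than the paper's published algorithm. The directory layout, smoothing parameters, single-person assumption, and walking direction are illustrative assumptions.

```python
"""Illustrative sketch only: not the published EVGS implementation.
Estimates heel-strike frames from OpenPose BODY25 output for one walking trial."""
import glob
import json

import numpy as np
from scipy.signal import find_peaks, savgol_filter

# Standard OpenPose BODY_25 keypoint indices.
MID_HIP, L_HEEL, R_HEEL = 8, 21, 24


def load_keypoint_xy(json_dir, index):
    """Read the (x, y) track of one keypoint from OpenPose per-frame JSON files,
    assuming a single person per frame (keypoints stored as x, y, confidence triplets).
    Frames with no detected person are left as NaN for simplicity."""
    xs, ys = [], []
    for path in sorted(glob.glob(f"{json_dir}/*_keypoints.json")):
        with open(path) as f:
            people = json.load(f)["people"]
        if not people:
            xs.append(np.nan)
            ys.append(np.nan)
            continue
        kp = people[0]["pose_keypoints_2d"]
        xs.append(kp[3 * index])
        ys.append(kp[3 * index + 1])
    return np.array(xs), np.array(ys)


def heel_strike_frames(json_dir, side="right", fps=60):
    """Estimate heel-strike frames as peaks of heel position anterior to the pelvis
    (a coordinate-based heuristic); assumes the subject walks toward +x in the image."""
    heel_x, _ = load_keypoint_xy(json_dir, R_HEEL if side == "right" else L_HEEL)
    hip_x, _ = load_keypoint_xy(json_dir, MID_HIP)
    rel = savgol_filter(heel_x - hip_x, window_length=11, polyorder=3)  # light smoothing
    # Roughly one stride per second at walking speed, so enforce a minimum peak separation.
    peaks, _ = find_peaks(rel, distance=int(0.5 * fps))
    return peaks  # frame indices of estimated heel strikes


if __name__ == "__main__":
    print(heel_strike_frames("openpose_output", side="right"))
```

Strides could then be taken as the intervals between successive heel strikes of the same foot, which is where EVGS parameters would be evaluated; this too is an assumption about the general approach, not the record's published method.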
format | Online Article Text |
id | pubmed-10220686 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-10220686 2023-05-28 Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video Ramesh, Shri Harini Lemaire, Edward D. Tu, Albert Cheung, Kevin Baddour, Natalie Sensors (Basel) Article Recent advancements in computing and artificial intelligence (AI) make it possible to quantitatively evaluate human movement using digital video, thereby opening the possibility of more accessible gait analysis. The Edinburgh Visual Gait Score (EVGS) is an effective tool for observational gait analysis, but human scoring of videos can take over 20 min and requires experienced observers. This research developed an algorithmic implementation of the EVGS from handheld smartphone video to enable automatic scoring. Participant walking was video recorded at 60 Hz using a smartphone, and body keypoints were identified using the OpenPose BODY25 pose estimation model. An algorithm was developed to identify foot events and strides, and EVGS parameters were determined at relevant gait events. Stride detection was accurate within two to five frames. The level of agreement between the algorithmic and human reviewer EVGS results was strong for 14 of 17 parameters, and the algorithmic EVGS results were highly correlated (r > 0.80, “r” represents the Pearson correlation coefficient) to the ground truth values for 8 of the 17 parameters. This approach could make gait analysis more accessible and cost-effective, particularly in areas without gait assessment expertise. These findings pave the way for future studies to explore the use of smartphone video and AI algorithms in remote gait analysis. MDPI 2023-05-17 /pmc/articles/PMC10220686/ /pubmed/37430751 http://dx.doi.org/10.3390/s23104839 Text en © 2023 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Ramesh, Shri Harini Lemaire, Edward D. Tu, Albert Cheung, Kevin Baddour, Natalie Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video |
title | Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video |
title_full | Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video |
title_fullStr | Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video |
title_full_unstemmed | Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video |
title_short | Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video |
title_sort | automated implementation of the edinburgh visual gait score (evgs) using openpose and handheld smartphone video |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10220686/ https://www.ncbi.nlm.nih.gov/pubmed/37430751 http://dx.doi.org/10.3390/s23104839 |
work_keys_str_mv | AT rameshshriharini automatedimplementationoftheedinburghvisualgaitscoreevgsusingopenposeandhandheldsmartphonevideo AT lemaireedwardd automatedimplementationoftheedinburghvisualgaitscoreevgsusingopenposeandhandheldsmartphonevideo AT tualbert automatedimplementationoftheedinburghvisualgaitscoreevgsusingopenposeandhandheldsmartphonevideo AT cheungkevin automatedimplementationoftheedinburghvisualgaitscoreevgsusingopenposeandhandheldsmartphonevideo AT baddournatalie automatedimplementationoftheedinburghvisualgaitscoreevgsusingopenposeandhandheldsmartphonevideo |