Real-time automated detection of older adults' hand gestures in home and clinical settings
There is an urgent need, accelerated by the COVID-19 pandemic, for methods that allow clinicians and neuroscientists to remotely evaluate hand movements. This would help detect and monitor degenerative brain disorders that are particularly prevalent in older adults. With the wide accessibility of computer cameras, a vision-based real-time hand gesture detection method would facilitate online assessments in home and clinical settings. However, motion blur is one of the most challenging problems in the fast-moving hands data collection. The objective of this study was to develop a computer vision-based method that accurately detects older adults’ hand gestures using video data collected in real-life settings. We invited adults over 50 years old to complete validated hand movement tests (fast finger tapping and hand opening–closing) at home or in clinic. Data were collected without researcher supervision via a website programme using standard laptop and desktop cameras. We processed and labelled images, split the data into training, validation and testing, respectively, and then analysed how well different network structures detected hand gestures. We recruited 1,900 adults (age range 50–90 years) as part of the TAS Test project and developed UTAS7k—a new dataset of 7071 hand gesture images, split 4:1 into clear: motion-blurred images. Our new network, RGRNet, achieved 0.782 mean average precision (mAP) on clear images, outperforming the state-of-the-art network structure (YOLOV5-P6, mAP 0.776), and mAP 0.771 on blurred images. A new robust real-time automated network that detects static gestures from a single camera, RGRNet, and a new database comprising the largest range of individual hands, UTAS7k, both show strong potential for medical and research applications.
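The abstract describes the data pipeline only at a high level: UTAS7k contains 7071 hand gesture images split roughly 4:1 into clear and motion-blurred examples, and the data were further partitioned into training, validation and testing sets before comparing detectors by mean average precision. As a rough illustration only, the Python sketch below shows one way such a partition and clear-to-blur ratio check could be scripted. The directory layout, the "blur" file-name tag, and the 80/10/10 split ratios are assumptions for the sketch, not details taken from the paper or its released code.

```python
# Illustrative sketch only: partition a labelled image folder into
# train/validation/test lists and report a clear vs. motion-blurred count.
# Directory name, file-name tag, and split ratios are assumptions.
import random
from pathlib import Path


def split_dataset(image_dir: str, train: float = 0.8, val: float = 0.1, seed: int = 42):
    """Shuffle the images deterministically and return train/val/test lists."""
    images = sorted(Path(image_dir).glob("*.jpg"))
    random.Random(seed).shuffle(images)
    n = len(images)
    n_train, n_val = int(n * train), int(n * val)
    return {
        "train": images[:n_train],
        "val": images[n_train:n_train + n_val],
        "test": images[n_train + n_val:],
    }


def clear_blur_counts(images, blur_tag: str = "blur"):
    """Count clear vs. blurred images, assuming blurred files carry a tag in the name."""
    blurred = sum(1 for p in images if blur_tag in p.stem.lower())
    return len(images) - blurred, blurred


if __name__ == "__main__":
    splits = split_dataset("UTAS7k/images")  # hypothetical path, not from the paper
    for name, imgs in splits.items():
        clear, blurred = clear_blur_counts(imgs)
        print(f"{name}: {len(imgs)} images ({clear} clear, {blurred} blurred)")
```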
Main Authors: | Huang, Guan; Tran, Son N.; Bai, Quan; Alty, Jane |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Springer London, 2022 |
Subjects: | Original Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9741488/ https://www.ncbi.nlm.nih.gov/pubmed/36532882 http://dx.doi.org/10.1007/s00521-022-08090-8 |
_version_ | 1784848332817956864 |
---|---|
author | Huang, Guan Tran, Son N. Bai, Quan Alty, Jane |
author_facet | Huang, Guan Tran, Son N. Bai, Quan Alty, Jane |
author_sort | Huang, Guan |
collection | PubMed |
description | There is an urgent need, accelerated by the COVID-19 pandemic, for methods that allow clinicians and neuroscientists to remotely evaluate hand movements. This would help detect and monitor degenerative brain disorders that are particularly prevalent in older adults. With the wide accessibility of computer cameras, a vision-based real-time hand gesture detection method would facilitate online assessments in home and clinical settings. However, motion blur is one of the most challenging problems in the fast-moving hands data collection. The objective of this study was to develop a computer vision-based method that accurately detects older adults’ hand gestures using video data collected in real-life settings. We invited adults over 50 years old to complete validated hand movement tests (fast finger tapping and hand opening–closing) at home or in clinic. Data were collected without researcher supervision via a website programme using standard laptop and desktop cameras. We processed and labelled images, split the data into training, validation and testing, respectively, and then analysed how well different network structures detected hand gestures. We recruited 1,900 adults (age range 50–90 years) as part of the TAS Test project and developed UTAS7k—a new dataset of 7071 hand gesture images, split 4:1 into clear: motion-blurred images. Our new network, RGRNet, achieved 0.782 mean average precision (mAP) on clear images, outperforming the state-of-the-art network structure (YOLOV5-P6, mAP 0.776), and mAP 0.771 on blurred images. A new robust real-time automated network that detects static gestures from a single camera, RGRNet, and a new database comprising the largest range of individual hands, UTAS7k, both show strong potential for medical and research applications. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s00521-022-08090-8. |
format | Online Article Text |
id | pubmed-9741488 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Springer London |
record_format | MEDLINE/PubMed |
spelling | pubmed-97414882022-12-12 Real-time automated detection of older adults' hand gestures in home and clinical settings Huang, Guan Tran, Son N. Bai, Quan Alty, Jane Neural Comput Appl Original Article There is an urgent need, accelerated by the COVID-19 pandemic, for methods that allow clinicians and neuroscientists to remotely evaluate hand movements. This would help detect and monitor degenerative brain disorders that are particularly prevalent in older adults. With the wide accessibility of computer cameras, a vision-based real-time hand gesture detection method would facilitate online assessments in home and clinical settings. However, motion blur is one of the most challenging problems in the fast-moving hands data collection. The objective of this study was to develop a computer vision-based method that accurately detects older adults’ hand gestures using video data collected in real-life settings. We invited adults over 50 years old to complete validated hand movement tests (fast finger tapping and hand opening–closing) at home or in clinic. Data were collected without researcher supervision via a website programme using standard laptop and desktop cameras. We processed and labelled images, split the data into training, validation and testing, respectively, and then analysed how well different network structures detected hand gestures. We recruited 1,900 adults (age range 50–90 years) as part of the TAS Test project and developed UTAS7k—a new dataset of 7071 hand gesture images, split 4:1 into clear: motion-blurred images. Our new network, RGRNet, achieved 0.782 mean average precision (mAP) on clear images, outperforming the state-of-the-art network structure (YOLOV5-P6, mAP 0.776), and mAP 0.771 on blurred images. A new robust real-time automated network that detects static gestures from a single camera, RGRNet, and a new database comprising the largest range of individual hands, UTAS7k, both show strong potential for medical and research applications. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s00521-022-08090-8. Springer London 2022-12-10 2023 /pmc/articles/PMC9741488/ /pubmed/36532882 http://dx.doi.org/10.1007/s00521-022-08090-8 Text en © The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2022, Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law. This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic. |
spellingShingle | Original Article Huang, Guan Tran, Son N. Bai, Quan Alty, Jane Real-time automated detection of older adults' hand gestures in home and clinical settings |
title | Real-time automated detection of older adults' hand gestures in home and clinical settings |
title_full | Real-time automated detection of older adults' hand gestures in home and clinical settings |
title_fullStr | Real-time automated detection of older adults' hand gestures in home and clinical settings |
title_full_unstemmed | Real-time automated detection of older adults' hand gestures in home and clinical settings |
title_short | Real-time automated detection of older adults' hand gestures in home and clinical settings |
title_sort | real-time automated detection of older adults' hand gestures in home and clinical settings |
topic | Original Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9741488/ https://www.ncbi.nlm.nih.gov/pubmed/36532882 http://dx.doi.org/10.1007/s00521-022-08090-8 |
work_keys_str_mv | AT huangguan realtimeautomateddetectionofolderadultshandgesturesinhomeandclinicalsettings AT transonn realtimeautomateddetectionofolderadultshandgesturesinhomeandclinicalsettings AT baiquan realtimeautomateddetectionofolderadultshandgesturesinhomeandclinicalsettings AT altyjane realtimeautomateddetectionofolderadultshandgesturesinhomeandclinicalsettings |