Fully automated image-based estimation of postural point-features in children with cerebral palsy using deep learning

Bibliographic Details
Main Authors: Cunningham, Ryan, Sánchez, María B., Butler, Penelope B., Southgate, Matthew J., Loram, Ian D.
Format: Online Article Text
Language: English
Published: The Royal Society, 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6894590/
https://www.ncbi.nlm.nih.gov/pubmed/31827842
http://dx.doi.org/10.1098/rsos.191011
Description
Summary: The aim of this study was to provide automated identification of postural point-features required to estimate the location and orientation of the head, multi-segmented trunk and arms from videos of the clinical test ‘Segmental Assessment of Trunk Control’ (SATCo). Three expert operators manually annotated 13 point-features in every fourth image of 177 short (5–10 s) videos (25 Hz) of 12 children with cerebral palsy (aged 4.52 ± 2.4 years) participating in SATCo testing. Linear interpolation for the remaining images resulted in 30 825 annotated images. Convolutional neural networks were trained with cross-validation, giving held-out test results for all children. The point-features were estimated with error 4.4 ± 3.8 pixels at approximately 100 images per second. Truncal segment angles (head, neck and six thoraco-lumbar–pelvic segments) were estimated with error 6.4 ± 2.8°, allowing accurate classification (F1 > 80%) of deviation from a reference posture at thresholds up to 3°, 3° and 2°, respectively. Contact between arm point-features (elbow and wrist) and the supporting surface was classified at F1 = 80.5%. This study demonstrates, for the first time, the technical feasibility of automating the identification of (i) a sitting segmental posture including individual trunk segments, (ii) changes away from that posture, and (iii) support from the upper limb, as required for the clinical SATCo.
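The abstract does not give implementation details. As a minimal sketch of the kind of post-processing it describes, the Python fragment below derives a segment angle from two estimated point-features and flags deviation from a reference posture at a fixed angular threshold; the coordinates, function names and the 3° threshold placement are illustrative assumptions, not the authors' code.

```python
import numpy as np

def segment_angle(proximal, distal):
    """Angle (degrees) of the segment joining two point-features,
    measured relative to the vertical image axis (image y points down)."""
    dx = distal[0] - proximal[0]
    dy = distal[1] - proximal[1]
    return np.degrees(np.arctan2(dx, -dy))

def deviates_from_reference(angle_deg, reference_deg, threshold_deg=3.0):
    """Flag a deviation when the segment angle differs from the
    reference (aligned) posture by more than the threshold."""
    return abs(angle_deg - reference_deg) > threshold_deg

# Hypothetical pixel coordinates (x, y) for one trunk segment.
reference = segment_angle((120.0, 310.0), (118.0, 250.0))  # aligned posture
current = segment_angle((120.0, 310.0), (108.0, 252.0))    # test frame
print(deviates_from_reference(current, reference))
```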