BioMAT: An Open-Source Biomechanics Multi-Activity Transformer for Joint Kinematic Predictions Using Wearable Sensors
Main Authors: | Sharifi-Renani, Mohsen; Mahoor, Mohammad H.; Clary, Chadd W. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10346710/ https://www.ncbi.nlm.nih.gov/pubmed/37447628 http://dx.doi.org/10.3390/s23135778 |
_version_ | 1785073377667448832 |
---|---|
author | Sharifi-Renani, Mohsen; Mahoor, Mohammad H.; Clary, Chadd W. |
author_facet | Sharifi-Renani, Mohsen; Mahoor, Mohammad H.; Clary, Chadd W. |
author_sort | Sharifi-Renani, Mohsen |
collection | PubMed |
description | Through wearable sensors and deep learning techniques, biomechanical analysis can reach beyond the lab for clinical and sporting applications. Transformers, a class of recent deep learning models, have become widely used in state-of-the-art artificial intelligence research due to their superior performance in various natural language processing and computer vision tasks. The performance of transformer models has not yet been investigated in biomechanics applications. In this study, we introduce a Biomechanical Multi-activity Transformer-based model, BioMAT, for the estimation of joint kinematics from streaming signals of multiple inertial measurement units (IMUs) using a publicly available dataset. This dataset includes IMU signals and the corresponding sagittal plane kinematics of the hip, knee, and ankle joints during multiple activities of daily living. We evaluated the model’s performance and generalizability and compared it against a convolutional neural network long short-term memory model, a bidirectional long short-term memory model, and multi-linear regression across different ambulation tasks including level ground walking (LW), ramp ascent (RA), ramp descent (RD), stair ascent (SA), and stair descent (SD). To investigate the effect of different activity datasets on prediction accuracy, we compared the performance of a universal model trained on all activities against task-specific models trained on individual tasks. When the models were tested on three unseen subjects’ data, BioMAT outperformed the benchmark models with an average root mean square error (RMSE) of 5.5 ± 0.5° and normalized RMSE of 6.8 ± 0.3° across all three joints and all activities. A unified BioMAT model demonstrated superior performance compared to individual task-specific models across four of five activities. The RMSE values from the universal model for LW, RA, RD, SA, and SD activities were 5.0 ± 1.5°, 6.2 ± 1.1°, 5.8 ± 1.1°, 5.3 ± 1.6°, and 5.2 ± 0.7°, while these values for task-specific models were 5.3 ± 2.1°, 6.7 ± 2.0°, 6.9 ± 2.2°, 4.9 ± 1.4°, and 5.6 ± 1.3°, respectively. Overall, BioMAT accurately estimated joint kinematics relative to previous machine learning algorithms across different activities directly from the sequence of IMU signals instead of time-normalized gait cycle data. |
format | Online Article Text |
id | pubmed-10346710 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-10346710 2023-07-15 BioMAT: An Open-Source Biomechanics Multi-Activity Transformer for Joint Kinematic Predictions Using Wearable Sensors Sharifi-Renani, Mohsen; Mahoor, Mohammad H.; Clary, Chadd W. Sensors (Basel) Article Through wearable sensors and deep learning techniques, biomechanical analysis can reach beyond the lab for clinical and sporting applications. Transformers, a class of recent deep learning models, have become widely used in state-of-the-art artificial intelligence research due to their superior performance in various natural language processing and computer vision tasks. The performance of transformer models has not yet been investigated in biomechanics applications. In this study, we introduce a Biomechanical Multi-activity Transformer-based model, BioMAT, for the estimation of joint kinematics from streaming signals of multiple inertial measurement units (IMUs) using a publicly available dataset. This dataset includes IMU signals and the corresponding sagittal plane kinematics of the hip, knee, and ankle joints during multiple activities of daily living. We evaluated the model’s performance and generalizability and compared it against a convolutional neural network long short-term memory model, a bidirectional long short-term memory model, and multi-linear regression across different ambulation tasks including level ground walking (LW), ramp ascent (RA), ramp descent (RD), stair ascent (SA), and stair descent (SD). To investigate the effect of different activity datasets on prediction accuracy, we compared the performance of a universal model trained on all activities against task-specific models trained on individual tasks. When the models were tested on three unseen subjects’ data, BioMAT outperformed the benchmark models with an average root mean square error (RMSE) of 5.5 ± 0.5° and normalized RMSE of 6.8 ± 0.3° across all three joints and all activities. A unified BioMAT model demonstrated superior performance compared to individual task-specific models across four of five activities. The RMSE values from the universal model for LW, RA, RD, SA, and SD activities were 5.0 ± 1.5°, 6.2 ± 1.1°, 5.8 ± 1.1°, 5.3 ± 1.6°, and 5.2 ± 0.7°, while these values for task-specific models were 5.3 ± 2.1°, 6.7 ± 2.0°, 6.9 ± 2.2°, 4.9 ± 1.4°, and 5.6 ± 1.3°, respectively. Overall, BioMAT accurately estimated joint kinematics relative to previous machine learning algorithms across different activities directly from the sequence of IMU signals instead of time-normalized gait cycle data. MDPI 2023-06-21 /pmc/articles/PMC10346710/ /pubmed/37447628 http://dx.doi.org/10.3390/s23135778 Text en © 2023 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Sharifi-Renani, Mohsen; Mahoor, Mohammad H.; Clary, Chadd W. BioMAT: An Open-Source Biomechanics Multi-Activity Transformer for Joint Kinematic Predictions Using Wearable Sensors |
title | BioMAT: An Open-Source Biomechanics Multi-Activity Transformer for Joint Kinematic Predictions Using Wearable Sensors |
title_full | BioMAT: An Open-Source Biomechanics Multi-Activity Transformer for Joint Kinematic Predictions Using Wearable Sensors |
title_fullStr | BioMAT: An Open-Source Biomechanics Multi-Activity Transformer for Joint Kinematic Predictions Using Wearable Sensors |
title_full_unstemmed | BioMAT: An Open-Source Biomechanics Multi-Activity Transformer for Joint Kinematic Predictions Using Wearable Sensors |
title_short | BioMAT: An Open-Source Biomechanics Multi-Activity Transformer for Joint Kinematic Predictions Using Wearable Sensors |
title_sort | biomat: an open-source biomechanics multi-activity transformer for joint kinematic predictions using wearable sensors |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10346710/ https://www.ncbi.nlm.nih.gov/pubmed/37447628 http://dx.doi.org/10.3390/s23135778 |
work_keys_str_mv | AT sharifirenanimohsen biomatanopensourcebiomechanicsmultiactivitytransformerforjointkinematicpredictionsusingwearablesensors AT mahoormohammadh biomatanopensourcebiomechanicsmultiactivitytransformerforjointkinematicpredictionsusingwearablesensors AT clarychaddw biomatanopensourcebiomechanicsmultiactivitytransformerforjointkinematicpredictionsusingwearablesensors |
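The description field above outlines how BioMAT maps windows of streaming IMU signals to sagittal-plane hip, knee, and ankle angles with a transformer, rather than working from time-normalized gait cycles. The snippet below is a minimal illustrative sketch of such a sequence-to-sequence transformer regressor; it is not the authors' released implementation, and the module names, window length, channel count, and hyperparameters are assumptions made only for the example.

```python
# Minimal sketch (not the authors' released BioMAT code) of a transformer-based
# regressor mapping a window of IMU channels to sagittal-plane joint angles.
# All dimensions and hyperparameters below are assumptions for illustration.
import torch
import torch.nn as nn


class IMUTransformerRegressor(nn.Module):
    def __init__(self, n_channels=24, d_model=64, n_heads=4, n_layers=3, n_joints=3, window=200):
        super().__init__()
        # Project raw IMU channels (accelerometer/gyroscope axes) into the model dimension.
        self.input_proj = nn.Linear(n_channels, d_model)
        # Learned positional encoding over the input window (assumed 200 samples).
        self.pos_embed = nn.Parameter(torch.zeros(1, window, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Per-time-step regression head: one sagittal angle each for hip, knee, and ankle.
        self.head = nn.Linear(d_model, n_joints)

    def forward(self, x):
        # x: (batch, time, channels) streaming IMU window
        h = self.input_proj(x) + self.pos_embed[:, : x.shape[1], :]
        h = self.encoder(h)
        return self.head(h)  # (batch, time, n_joints) joint angles in degrees


# Example usage with random data standing in for a 2 s window sampled at 100 Hz.
model = IMUTransformerRegressor()
imu_window = torch.randn(8, 200, 24)
joint_angles = model(imu_window)
print(joint_angles.shape)  # torch.Size([8, 200, 3])
```

Under this sketch, the universal-versus-task-specific comparison described in the abstract amounts to training the same architecture once on windows pooled across all five activities and separately on per-activity subsets.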
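The abstract reports accuracy as RMSE in degrees and as a normalized RMSE. The snippet below is a minimal sketch of how such metrics are commonly computed for joint-angle trajectories; normalizing by the ground-truth range of motion is a common convention assumed here, not a definition taken from the paper, and the synthetic knee curve is purely illustrative.

```python
# Minimal sketch of the error metrics referenced in the abstract: RMSE in degrees
# and a range-normalized RMSE. The normalization convention is an assumption.
import numpy as np


def rmse(y_true, y_pred):
    """Root mean square error over a kinematic trajectory (degrees)."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))


def nrmse(y_true, y_pred):
    """RMSE normalized by the ground-truth range of motion (percent)."""
    y_true = np.asarray(y_true)
    return 100.0 * rmse(y_true, y_pred) / (y_true.max() - y_true.min())


# Example with a synthetic knee-flexion-like curve over one stride.
t = np.linspace(0.0, 1.0, 200)
knee_true = 30.0 + 30.0 * np.sin(2.0 * np.pi * t)             # ground-truth angle, degrees
knee_pred = knee_true + np.random.normal(0.0, 5.0, t.shape)   # noisy estimate
print(f"RMSE:  {rmse(knee_true, knee_pred):.1f} deg")
print(f"nRMSE: {nrmse(knee_true, knee_pred):.1f} %")
```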