
Movement Competency Screens Can Be Reliable In Clinical Practice By A Single Rater Using The Composite Score

Bibliographic Details
Main Authors: Mann, Kerry J., O’Dwyer, Nicholas, Bruton, Michaela R., Bird, Stephen P., Edwards, Suzi
Format: Online Article Text
Language: English
Published: NASMI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9159707/
https://www.ncbi.nlm.nih.gov/pubmed/35693862
author Mann, Kerry J.
O’Dwyer, Nicholas
Bruton, Michaela R.
Bird, Stephen P.
Edwards, Suzi
collection PubMed
description BACKGROUND: Movement competency screens (MCSs) are commonly used by coaches and clinicians to assess injury risk. However, there is conflicting evidence regarding MCS reliability. PURPOSE: This study aimed to: (i) determine the inter- and intra-rater reliability of a sport-specific, field-based MCS in novice and expert raters using different viewing methods (single and multiple views); and (ii) ascertain whether there were familiarization effects from repeated exposure for either raters or participants. STUDY DESIGN: Descriptive laboratory study. METHODS: Pre-elite youth athletes (n=51) were recruited and videotaped while performing an MCS comprising nine dynamic movements in three separate trials. Performances were rated three times, with a minimum four-week washout between testing sessions, each in randomized order by 12 raters (3 expert, 9 novice) using a three-point scale. Kappa score, percentage agreement, and intra-class correlation were calculated for each movement individually and for the composite score. RESULTS: Fifty-one pre-elite youth athletes (15.0±1.6 years; n=33 athletics, n=10 BMX, and n=8 surfing) were included in the study. Based on kappa score and percentage agreement, both inter- and intra-rater reliability were highly variable for individual movements but consistently high (>0.70) for the MCS composite score. The composite score did not increase with task familiarization by the athletes. Experts detected more movement errors than novices, and both rating groups improved their detection of errors with repeated viewings of the same movement. CONCLUSIONS: Irrespective of experience, raters demonstrated high variability in rating single movements, yet preliminary evidence suggests the MCS composite score could reliably assess movement competency. While athletes did not display a familiarization effect after performing the novel tasks within the MCS for the first time, raters showed improved error detection on repeated viewing of the same movement. LEVEL OF EVIDENCE: Cohort study
format Online
Article
Text
id pubmed-9159707
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher NASMI
record_format MEDLINE/PubMed
spelling pubmed-9159707 2022-06-09 Movement Competency Screens Can Be Reliable In Clinical Practice By A Single Rater Using The Composite Score Mann, Kerry J. O’Dwyer, Nicholas Bruton, Michaela R. Bird, Stephen P. Edwards, Suzi Int J Sports Phys Ther Original Research NASMI 2022-06-01 /pmc/articles/PMC9159707/ /pubmed/35693862 Text en https://creativecommons.org/licenses/by-nc/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution-NonCommercial License (4.0) (https://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial use, distribution, and reproduction in any medium, provided the original author and source are credited.
title Movement Competency Screens Can Be Reliable In Clinical Practice By A Single Rater Using The Composite Score
topic Original Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9159707/
https://www.ncbi.nlm.nih.gov/pubmed/35693862