Inter- and intraobserver agreement of three classification systems for lateral clavicle fractures – reliability comparison between two specialist groups
BACKGROUND: Although of great value in the management of lateral clavicle fractures, substantial variation in their classification exists. We performed a retrospective study to address the inter- and intraobserver reliability of three different classification systems for lateral clavicle fractures....
Main Authors: | Rauer, Thomas; Boos, Matthias; Neuhaus, Valentin; Ellanti, Prasad; Kaufmann, Robert Alexander; Pape, Hans-Christoph; Allemann, Florin |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | BioMed Central 2020 |
Subjects: | Research |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6945566/ https://www.ncbi.nlm.nih.gov/pubmed/31911821 http://dx.doi.org/10.1186/s13037-019-0228-y |
_version_ | 1783485204406468608 |
---|---|
author | Rauer, Thomas Boos, Matthias Neuhaus, Valentin Ellanti, Prasad Kaufmann, Robert Alexander Pape, Hans-Christoph Allemann, Florin |
author_facet | Rauer, Thomas Boos, Matthias Neuhaus, Valentin Ellanti, Prasad Kaufmann, Robert Alexander Pape, Hans-Christoph Allemann, Florin |
author_sort | Rauer, Thomas |
collection | PubMed |
description | BACKGROUND: Although of great value in the management of lateral clavicle fractures, substantial variation in their classification exists. We performed a retrospective study to address the inter- and intraobserver reliability of three different classification systems for lateral clavicle fractures. METHODS: Radiographs of 20 lateral clavicle fractures that represented a full spectrum of adult fracture patterns were graded by five experienced radiologists and five experienced trauma surgeons according to the Orthopaedic Trauma Association (OTA), the Neer, and the Jäger/Breitner classification systems. This evaluation was performed at two different time points separated by 3 months. To measure observer agreement, the Fleiss kappa coefficient (κ) was applied and assessed according to the grading of Landis and Koch. RESULTS: The overall interobserver reliability showed fair agreement in all three classification systems. For the OTA classification system, the interobserver agreement showed a mean kappa value of 0.338, ranging from 0.350 (radiologists) to 0.374 (trauma surgeons). Kappa values of the interobserver agreement for the Neer classification system ranged from 0.238 (trauma surgeons) to 0.276 (radiologists), with a mean κ of 0.278. The Jäger/Breitner classification system demonstrated a mean kappa value of 0.330, ranging from 0.306 (trauma surgeons) to 0.382 (radiologists). The overall intraobserver reliability was moderate for the OTA and the Jäger/Breitner classification systems, while the overall intraobserver reliability for the Neer classification system was fair. The kappa values of the intraobserver agreement covered a wide range in all classification systems: from 0.086 to 0.634 for the OTA, from 0.137 to 0.448 for the Neer, and from 0.154 to 0.625 for the Jäger/Breitner classification system. CONCLUSIONS: The low inter- and intraobserver agreement levels exhibited in all three classification systems by both specialist groups suggest that the tested lateral clavicle fracture classification systems are unreliable and, therefore, of limited value. There is considerable inconsistency in how physicians classify lateral clavicle fractures, so any conclusions based on these classifications should be regarded as somewhat subjective. (An illustrative sketch of the Fleiss kappa computation and its Landis and Koch grading follows this record.) |
format | Online Article Text |
id | pubmed-6945566 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-6945566 2020-01-07 Inter- and intraobserver agreement of three classification systems for lateral clavicle fractures – reliability comparison between two specialist groups Rauer, Thomas Boos, Matthias Neuhaus, Valentin Ellanti, Prasad Kaufmann, Robert Alexander Pape, Hans-Christoph Allemann, Florin Patient Saf Surg Research BioMed Central 2020-01-07 /pmc/articles/PMC6945566/ /pubmed/31911821 http://dx.doi.org/10.1186/s13037-019-0228-y Text en © The Author(s). 2020 Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated. |
spellingShingle | Research Rauer, Thomas Boos, Matthias Neuhaus, Valentin Ellanti, Prasad Kaufmann, Robert Alexander Pape, Hans-Christoph Allemann, Florin Inter- and intraobserver agreement of three classification systems for lateral clavicle fractures – reliability comparison between two specialist groups |
title | Inter- and intraobserver agreement of three classification systems for lateral clavicle fractures – reliability comparison between two specialist groups |
title_full | Inter- and intraobserver agreement of three classification systems for lateral clavicle fractures – reliability comparison between two specialist groups |
title_fullStr | Inter- and intraobserver agreement of three classification systems for lateral clavicle fractures – reliability comparison between two specialist groups |
title_full_unstemmed | Inter- and intraobserver agreement of three classification systems for lateral clavicle fractures – reliability comparison between two specialist groups |
title_short | Inter- and intraobserver agreement of three classification systems for lateral clavicle fractures – reliability comparison between two specialist groups |
title_sort | inter- and intraobserver agreement of three classification systems for lateral clavicle fractures – reliability comparison between two specialist groups |
topic | Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6945566/ https://www.ncbi.nlm.nih.gov/pubmed/31911821 http://dx.doi.org/10.1186/s13037-019-0228-y |
work_keys_str_mv | AT rauerthomas interandintraobserveragreementofthreeclassificationsystemsforlateralclaviclefracturesreliabilitycomparisonbetweentwospecialistgroups AT boosmatthias interandintraobserveragreementofthreeclassificationsystemsforlateralclaviclefracturesreliabilitycomparisonbetweentwospecialistgroups AT neuhausvalentin interandintraobserveragreementofthreeclassificationsystemsforlateralclaviclefracturesreliabilitycomparisonbetweentwospecialistgroups AT ellantiprasad interandintraobserveragreementofthreeclassificationsystemsforlateralclaviclefracturesreliabilitycomparisonbetweentwospecialistgroups AT kaufmannrobertalexander interandintraobserveragreementofthreeclassificationsystemsforlateralclaviclefracturesreliabilitycomparisonbetweentwospecialistgroups AT papehanschristoph interandintraobserveragreementofthreeclassificationsystemsforlateralclaviclefracturesreliabilitycomparisonbetweentwospecialistgroups AT allemannflorin interandintraobserveragreementofthreeclassificationsystemsforlateralclaviclefracturesreliabilitycomparisonbetweentwospecialistgroups |
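The description above reports observer agreement measured with the Fleiss kappa coefficient and graded according to Landis and Koch. As a rough illustration only (not the authors' analysis code, and using a made-up rating matrix), the following Python sketch shows how a Fleiss kappa could be computed from raters' category assignments and mapped onto the Landis and Koch labels.

```python
# Minimal, self-contained sketch of Fleiss' kappa with Landis & Koch grading.
# Illustrative only: the rating matrix below is hypothetical, not study data.

def fleiss_kappa(ratings):
    """ratings[i][j] = number of raters assigning subject i to category j.
    Every row must sum to the same number of raters."""
    n_subjects = len(ratings)
    n_categories = len(ratings[0])
    n_raters = sum(ratings[0])

    # Proportion of all assignments falling in each category.
    p_j = [sum(row[j] for row in ratings) / (n_subjects * n_raters)
           for j in range(n_categories)]

    # Extent of agreement among raters for each subject.
    P_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in ratings]

    P_bar = sum(P_i) / n_subjects   # mean observed agreement
    P_e = sum(p * p for p in p_j)   # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)

def landis_koch(kappa):
    """Map a kappa value to the Landis & Koch (1977) agreement category."""
    if kappa < 0:
        return "poor"
    for upper, label in [(0.20, "slight"), (0.40, "fair"),
                         (0.60, "moderate"), (0.80, "substantial")]:
        if kappa <= upper:
            return label
    return "almost perfect"

if __name__ == "__main__":
    # Hypothetical example: 5 raters classify 4 radiographs into 3 fracture types.
    ratings = [
        [4, 1, 0],
        [2, 2, 1],
        [0, 5, 0],
        [1, 1, 3],
    ]
    k = fleiss_kappa(ratings)
    print(f"Fleiss kappa = {k:.3f} ({landis_koch(k)} agreement)")
```

Run on the hypothetical matrix above, the sketch prints a kappa of about 0.25, which falls in the "fair" band of the Landis and Koch scale, the same band the study reports for overall interobserver agreement.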