
Evaluation of the Interobserver Agreement of the Fraser and Blake & McBryde Classifications for Floating Knee

Objective  To evaluate the interobserver agreement of two classifications for floating knee: Fraser and Blake & McBryde. Method  Thirty-two observers, subdivided according to their level of training (26 resident physicians and 6 orthopedic physicians specialized in orthopedic trauma), classified...


Bibliographic Details
Main Authors: Alencar Neto, Jonatas Brito, Osório Neto, Ernane Bruno, Souza, Clodoaldo José Duarte de, da Rocha, Pedro Henrique Messias, Cavalcante, Maria Luzete Costa, Lopes, Márcio Bezerra Gadelha
Format: Online Article Text
Language: English
Published: Thieme Revinter Publicações Ltda. 2021
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8405262/
https://www.ncbi.nlm.nih.gov/pubmed/34483389
http://dx.doi.org/10.1055/s-0040-1713388
_version_ 1783746294955638784
author Alencar Neto, Jonatas Brito
Osório Neto, Ernane Bruno
Souza, Clodoaldo José Duarte de
da Rocha, Pedro Henrique Messias
Cavalcante, Maria Luzete Costa
Lopes, Márcio Bezerra Gadelha
author_facet Alencar Neto, Jonatas Brito
Osório Neto, Ernane Bruno
Souza, Clodoaldo José Duarte de
da Rocha, Pedro Henrique Messias
Cavalcante, Maria Luzete Costa
Lopes, Márcio Bezerra Gadelha
author_sort Alencar Neto, Jonatas Brito
collection PubMed
description Objective  To evaluate the interobserver agreement of two classifications for floating knee: Fraser and Blake & McBryde. Method  Thirty-two observers, subdivided according to their level of training (26 resident physicians and 6 orthopedic physicians specialized in orthopedic trauma), classified 15 fractures of the ipsilateral femur and tibia. Interobserver agreement was evaluated using the Kappa coefficient. Result  When evaluating the agreement among the 9 first-year residents (R1), a Kappa index of 0.58 was obtained for the Fraser classification and of 0.46 for the Blake & McBryde classification. Among the 7 second-year residents (R2), an index of 0.59 was obtained for the Fraser classification and 0.51 for the Blake & McBryde classification. Among the 10 third-year residents (R3), the agreement index was higher for both classifications: 0.72 for the Fraser and 0.71 for the Blake & McBryde classification. Considering the 3 groups (R1, R2, R3) as one large group, the overall Kappa index was calculated, resulting in 0.63 for the Fraser classification and 0.56 for the Blake & McBryde classification. In the group of orthopedic trauma and knee specialists, in turn, an agreement of 0.597 was obtained for the Blake & McBryde classification and of 0.843 for the Fraser classification. Conclusion  Comparatively, the two classifications presented a weak to moderate degree of agreement. The Fraser classification had better agreement in both groups. Agreement was higher among the orthopedic trauma physicians.
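The abstract reports multi-rater agreement via the Kappa coefficient but does not specify which variant was computed; for more than two raters per case, Fleiss' kappa is a common choice. A minimal sketch of that statistic (an assumed illustration, not the authors' actual analysis code):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for n subjects rated by r raters into k categories.

    `counts` is an n x k matrix: counts[i][j] is the number of raters
    who assigned subject i to category j; every row must sum to r.
    """
    n = len(counts)        # subjects (e.g., 15 fractures in the study)
    r = sum(counts[0])     # raters per subject
    k = len(counts[0])     # categories (classification types)

    # Per-subject observed agreement P_i, then its mean P-bar
    p_i = [(sum(c * c for c in row) - r) / (r * (r - 1)) for row in counts]
    p_bar = sum(p_i) / n

    # Chance agreement P_e from the marginal category proportions
    p_j = [sum(row[j] for row in counts) / (n * r) for j in range(k)]
    p_e = sum(p * p for p in p_j)

    return (p_bar - p_e) / (1 - p_e)  # undefined when p_e == 1

# Toy example: 2 raters, 2 categories, perfect agreement on both cases
print(round(fleiss_kappa([[2, 0], [0, 2]]), 3))  # 1.0
```

Kappa rescales observed agreement so that 1 means perfect agreement and 0 means agreement no better than chance, which is why it is preferred over raw percent agreement for comparing classification systems.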
format Online
Article
Text
id pubmed-8405262
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher Thieme Revinter Publicações Ltda.
record_format MEDLINE/PubMed
spelling pubmed-8405262 2021-09-03 Evaluation of the Interobserver Agreement of the Fraser and Blake & McBryde Classifications for Floating Knee Alencar Neto, Jonatas Brito Osório Neto, Ernane Bruno Souza, Clodoaldo José Duarte de da Rocha, Pedro Henrique Messias Cavalcante, Maria Luzete Costa Lopes, Márcio Bezerra Gadelha Rev Bras Ortop (Sao Paulo) Objective  To evaluate the interobserver agreement of two classifications for floating knee: Fraser and Blake & McBryde. Method  Thirty-two observers, subdivided according to their level of training (26 resident physicians and 6 orthopedic physicians specialized in orthopedic trauma), classified 15 fractures of the ipsilateral femur and tibia. Interobserver agreement was evaluated using the Kappa coefficient. Result  When evaluating the agreement among the 9 first-year residents (R1), a Kappa index of 0.58 was obtained for the Fraser classification and of 0.46 for the Blake & McBryde classification. Among the 7 second-year residents (R2), an index of 0.59 was obtained for the Fraser classification and 0.51 for the Blake & McBryde classification. Among the 10 third-year residents (R3), the agreement index was higher for both classifications: 0.72 for the Fraser and 0.71 for the Blake & McBryde classification. Considering the 3 groups (R1, R2, R3) as one large group, the overall Kappa index was calculated, resulting in 0.63 for the Fraser classification and 0.56 for the Blake & McBryde classification. In the group of orthopedic trauma and knee specialists, in turn, an agreement of 0.597 was obtained for the Blake & McBryde classification and of 0.843 for the Fraser classification. Conclusion  Comparatively, the two classifications presented a weak to moderate degree of agreement. The Fraser classification had better agreement in both groups. Agreement was higher among the orthopedic trauma physicians. Thieme Revinter Publicações Ltda.
2021-08 2020-10-02 /pmc/articles/PMC8405262/ /pubmed/34483389 http://dx.doi.org/10.1055/s-0040-1713388 Text en Sociedade Brasileira de Ortopedia e Traumatologia. This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License ( https://creativecommons.org/licenses/by-nc-nd/4.0/ ), permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon.
spellingShingle Alencar Neto, Jonatas Brito
Osório Neto, Ernane Bruno
Souza, Clodoaldo José Duarte de
da Rocha, Pedro Henrique Messias
Cavalcante, Maria Luzete Costa
Lopes, Márcio Bezerra Gadelha
Evaluation of the Interobserver Agreement of the Fraser and Blake & McBryde Classifications for Floating Knee
title Evaluation of the Interobserver Agreement of the Fraser and Blake & McBryde Classifications for Floating Knee
title_full Evaluation of the Interobserver Agreement of the Fraser and Blake & McBryde Classifications for Floating Knee
title_fullStr Evaluation of the Interobserver Agreement of the Fraser and Blake & McBryde Classifications for Floating Knee
title_full_unstemmed Evaluation of the Interobserver Agreement of the Fraser and Blake & McBryde Classifications for Floating Knee
title_short Evaluation of the Interobserver Agreement of the Fraser and Blake & McBryde Classifications for Floating Knee
title_sort evaluation of the interobserver agreement of the fraser and blake & mcbryde classifications for floating knee
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8405262/
https://www.ncbi.nlm.nih.gov/pubmed/34483389
http://dx.doi.org/10.1055/s-0040-1713388
work_keys_str_mv AT alencarnetojonatasbrito evaluationoftheinterobserveragreementofthefraserandblakemcbrydeclassificationsforfloatingknee
AT osorionetoernanebruno evaluationoftheinterobserveragreementofthefraserandblakemcbrydeclassificationsforfloatingknee
AT souzaclodoaldojoseduartede evaluationoftheinterobserveragreementofthefraserandblakemcbrydeclassificationsforfloatingknee
AT darochapedrohenriquemessias evaluationoftheinterobserveragreementofthefraserandblakemcbrydeclassificationsforfloatingknee
AT cavalcantemarialuzetecosta evaluationoftheinterobserveragreementofthefraserandblakemcbrydeclassificationsforfloatingknee
AT lopesmarciobezerragadelha evaluationoftheinterobserveragreementofthefraserandblakemcbrydeclassificationsforfloatingknee