
Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures

OBJECTIVE: This study evaluated the intraobserver and interobserver reliability of the AO classification for standard radiographs of wrist fractures. METHODS: Thirty observers, divided into three groups (orthopedic surgery senior residents, orthopedic surgeons, and hand surgeons), classified 52 wrist fractures using only plain radiographs. After a period of four weeks, the same observers evaluated the initial 52 radiographs in a randomized order. Intraobserver agreement and agreement among observers and groups were measured with the kappa index; kappa values were interpreted as proposed by Landis and Koch. RESULTS: The global interobserver agreement for the AO classification was fair (0.30). All three groups showed fair global interobserver agreement (residents, 0.27; orthopedic surgeons, 0.30; hand surgeons, 0.33). The global intraobserver agreement was moderate. The hand surgeon group obtained the highest intraobserver agreement, although only moderate (0.50); the resident group (0.30) and the orthopedic surgeon group (0.33) reached fair levels. CONCLUSION: The data suggest fair interobserver agreement and moderate intraobserver agreement for the AO classification for wrist fractures.

Bibliographic Details
Main Authors: Tenório, Pedro Henrique de Magalhães, Vieira, Marcelo Marques, Alberti, Abner, Abreu, Marcos Felipe Marcatto de, Nakamoto, João Carlos, Cliquet, Alberto
Format: Online Article Text
Language: English
Published: Elsevier 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6204541/
https://www.ncbi.nlm.nih.gov/pubmed/30377603
http://dx.doi.org/10.1016/j.rboe.2017.08.024
author Tenório, Pedro Henrique de Magalhães
Vieira, Marcelo Marques
Alberti, Abner
Abreu, Marcos Felipe Marcatto de
Nakamoto, João Carlos
Cliquet, Alberto
author_sort Tenório, Pedro Henrique de Magalhães
collection PubMed
description OBJECTIVE: This study evaluated the intraobserver and interobserver reliability of the AO classification for standard radiographs of wrist fractures. METHODS: Thirty observers, divided into three groups (orthopedic surgery senior residents, orthopedic surgeons, and hand surgeons), classified 52 wrist fractures using only plain radiographs. After a period of four weeks, the same observers evaluated the initial 52 radiographs in a randomized order. Intraobserver agreement and agreement among observers and groups were measured with the kappa index; kappa values were interpreted as proposed by Landis and Koch. RESULTS: The global interobserver agreement for the AO classification was fair (0.30). All three groups showed fair global interobserver agreement (residents, 0.27; orthopedic surgeons, 0.30; hand surgeons, 0.33). The global intraobserver agreement was moderate. The hand surgeon group obtained the highest intraobserver agreement, although only moderate (0.50); the resident group (0.30) and the orthopedic surgeon group (0.33) reached fair levels. CONCLUSION: The data suggest fair interobserver agreement and moderate intraobserver agreement for the AO classification for wrist fractures.
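The agreement statistic used in the abstract can be sketched as follows. This is a minimal illustration, not code from the article: it computes Cohen's kappa for two raters over the same set of cases and maps the result onto the interpretation bands proposed by Landis and Koch (1977). The function names are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    who each classified the same cases (e.g. 52 wrist radiographs)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of cases on which the raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

def landis_koch(kappa):
    """Verbal interpretation bands proposed by Landis and Koch (1977)."""
    if kappa < 0:
        return "poor"
    label = "slight"
    for lower, name in [(0.2, "fair"), (0.4, "moderate"),
                        (0.6, "substantial"), (0.8, "almost perfect")]:
        if kappa >= lower:
            label = name
    return label
```

On this scale the study's global interobserver value of 0.30 falls in the "fair" band and the hand surgeons' intraobserver value of 0.50 in the "moderate" band, matching the wording of the abstract.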
format Online
Article
Text
id pubmed-6204541
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher Elsevier
record_format MEDLINE/PubMed
spelling pubmed-62045412018-10-30 Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures Tenório, Pedro Henrique de Magalhães Vieira, Marcelo Marques Alberti, Abner Abreu, Marcos Felipe Marcatto de Nakamoto, João Carlos Cliquet, Alberto Rev Bras Ortop Original Article OBJECTIVE: This study evaluated the intraobserver and interobserver reliability of the AO classification for standard radiographs of wrist fractures. METHODS: Thirty observers, divided into three groups (orthopedic surgery senior residents, orthopedic surgeons, and hand surgeons), classified 52 wrist fractures using only plain radiographs. After a period of four weeks, the same observers evaluated the initial 52 radiographs in a randomized order. Intraobserver agreement and agreement among observers and groups were measured with the kappa index; kappa values were interpreted as proposed by Landis and Koch. RESULTS: The global interobserver agreement for the AO classification was fair (0.30). All three groups showed fair global interobserver agreement (residents, 0.27; orthopedic surgeons, 0.30; hand surgeons, 0.33). The global intraobserver agreement was moderate. The hand surgeon group obtained the highest intraobserver agreement, although only moderate (0.50); the resident group (0.30) and the orthopedic surgeon group (0.33) reached fair levels. CONCLUSION: The data suggest fair interobserver agreement and moderate intraobserver agreement for the AO classification for wrist fractures. Elsevier 2018-10-12 /pmc/articles/PMC6204541/ /pubmed/30377603 http://dx.doi.org/10.1016/j.rboe.2017.08.024 Text en © 2017 Sociedade Brasileira de Ortopedia e Traumatologia. Published by Elsevier Editora Ltda. http://creativecommons.org/licenses/by-nc-nd/4.0/ This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
title Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures()
title_full Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures()
title_fullStr Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures()
title_full_unstemmed Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures()
title_short Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures()
title_sort evaluation of intra- and interobserver reliability of the ao classification for wrist fractures()
topic Original Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6204541/
https://www.ncbi.nlm.nih.gov/pubmed/30377603
http://dx.doi.org/10.1016/j.rboe.2017.08.024
work_keys_str_mv AT tenoriopedrohenriquedemagalhaes evaluationofintraandinterobserverreliabilityoftheaoclassificationforwristfractures
AT vieiramarcelomarques evaluationofintraandinterobserverreliabilityoftheaoclassificationforwristfractures
AT albertiabner evaluationofintraandinterobserverreliabilityoftheaoclassificationforwristfractures
AT abreumarcosfelipemarcattode evaluationofintraandinterobserverreliabilityoftheaoclassificationforwristfractures
AT nakamotojoaocarlos evaluationofintraandinterobserverreliabilityoftheaoclassificationforwristfractures
AT cliquetalberto evaluationofintraandinterobserverreliabilityoftheaoclassificationforwristfractures