Poster 121: Inter- and Intra-Observer Agreement are Unsatisfactory when Determining Study Design and Level of Evidence
OBJECTIVES: Understanding differences between types of study design (SD) and level of evidence (LOE) is important when selecting research for presentation or publication and determining its potential clinical impact. The purpose of this study was to evaluate inter- and intra-observer reliability when assigning LOE and SD, as well as to quantify the impact of a commonly used reference aid on these assessments.
| Main Authors: | Schmitz, Matthew; McKay, Scott; Patel, Neeraj |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | SAGE Publications, 2022 |
| Subjects: | Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9344158/ http://dx.doi.org/10.1177/2325967121S00682 |
_version_ | 1784761157643403264 |
---|---|
author | Schmitz, Matthew; McKay, Scott; Patel, Neeraj |
author_facet | Schmitz, Matthew; McKay, Scott; Patel, Neeraj |
author_sort | Schmitz, Matthew |
collection | PubMed |
description | OBJECTIVES: Understanding differences between types of study design (SD) and level of evidence (LOE) is important when selecting research for presentation or publication and determining its potential clinical impact. The purpose of this study was to evaluate inter- and intra-observer reliability when assigning LOE and SD, as well as to quantify the impact of a commonly used reference aid on these assessments. METHODS: Thirty-six accepted abstracts from the Pediatric Orthopaedic Society of North America (POSNA) 2021 annual meeting were selected for this study. Thirteen reviewers from the POSNA Evidence Based Practice Committee were asked to determine LOE and SD for each abstract, first without any assistance or resources. Four weeks later, abstracts were reviewed again with the guidance of the Journal of Bone and Joint Surgery (JBJS) LOE chart, which is adapted from the Oxford Centre for Evidence-Based Medicine. Inter- and intra-observer reliability were calculated using Fleiss’ kappa statistic (k). Chi-square analysis was used to compare the rate of SD-LOE mismatch between the first and second rounds of review. RESULTS: Inter-observer reliability for LOE improved slightly from fair (k=0.28) to moderate (k=0.43) with use of the JBJS chart. There was better agreement with increasing LOE, with the most frequent disagreement between levels 3 and 4. Inter-observer reliability for SD was fair for both rounds 1 (k=0.29) and 2 (k=0.37). Similar to LOE, there was better agreement with stronger study designs. The most common disagreements were between retrospective cohort, retrospective case-control, and case series. Intra-observer reliability was widely variable for both LOE and SD (k=0.10 to 0.92 for both). When matching a selected SD to its associated LOE, the overall rate of correct concordance was 82% in round 1 and 92% in round 2 (p<0.001). Six of the 13 respondents improved in round 2, and three were 100% correct both times. CONCLUSIONS: Inter-observer reliability for LOE and SD was fair to moderate at best, even among experienced reviewers. Use of the JBJS/Oxford chart mildly improved agreement on LOE and resulted in less SD-LOE mismatch, but did not affect agreement on SD. Professional societies and journals may consider requiring more specific information on SD from authors, while authors and reviewers may benefit from improved instruments to guide SD and LOE designation. |
format | Online Article Text |
id | pubmed-9344158 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | SAGE Publications |
record_format | MEDLINE/PubMed |
spelling | pubmed-93441582022-08-03 Poster 121: Inter- and Intra-Observer Agreement are Unsatisfactory when Determining Study Design and Level of Evidence Schmitz, Matthew McKay, Scott Patel, Neeraj Orthop J Sports Med Article OBJECTIVES: Understanding differences between types of study design (SD) and level of evidence (LOE) are important when selecting research for presentation or publication and determining its potential clinical impact. The purpose of this study was to evaluate inter- and intra-observer reliability when assigning LOE and SD as well as quantify the impact of a commonly used reference aid on these assessments. METHODS: Thirty-six accepted abstracts from the Pediatric Orthopaedic Society of North America (POSNA) 2021 annual meeting were selected for this study. Thirteen reviewers from the POSNA Evidence Based Practice Committee were asked to determine LOE and SD for each abstract, first without any assistance or resources. Four weeks later, abstracts were reviewed again with the guidance of the Journal of Bone and Joint Surgery (JBJS) LOE chart, which is adapted from the Oxford Centre for Evidence-Based Medicine. Inter- and intra-observer reliability were calculated using Fleiss’ kappa statistic (k). Chi-square analysis was used to compare the rate of SD-LOE mismatch between the first and second round of reviews. RESULTS: Inter-observer reliability for LOE improved slightly from fair (k=0.28) to moderate (k=0.43) with use of the JBJS chart. There was better agreement with increasing LOE, with the most frequent disagreement between levels 3 and 4. Inter-observer reliability for SD was fair for both rounds 1 (k=0.29) and 2 (k=0.37). Similar to LOE, there was better agreement with stronger study design. The most common disagreements were between retrospective cohort, retrospective case-control, and case series. Intra-observer reliability was widely variable for both LOE and SD (k=0.10 to 0.92 for both). When matching a selected SD to its associated LOE, the overall rate of correct concordance was 82% in round 1 and 92% in round 2 (p<0.001). Six of the 13 respondents improved in round 2, and 3 were 100% correct both times. CONCLUSIONS: Inter-observer reliability for LOE and SD was fair to moderate at best, even among experienced reviewers. Use of the JBJS/Oxford chart mildly improved agreement on LOE and resulted in less SD-LOE mismatch, but did not affect agreement on SD. Professional societies and journals may consider requiring more specific information on SD from authors, while authors and reviewers may benefit from improved instruments to guide SD and LOE designation. SAGE Publications 2022-07-28 /pmc/articles/PMC9344158/ http://dx.doi.org/10.1177/2325967121S00682 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by-nc-nd/4.0/This open-access article is published and distributed under the Creative Commons Attribution - NonCommercial - No Derivatives License (https://creativecommons.org/licenses/by-nc-nd/4.0/), which permits the noncommercial use, distribution, and reproduction of the article in any medium, provided the original author and source are credited. You may not alter, transform, or build upon this article without the permission of the Author(s). For article reuse guidelines, please visit SAGE’s website at http://www.sagepub.com/journals-permissions. |
spellingShingle | Article Schmitz, Matthew McKay, Scott Patel, Neeraj Poster 121: Inter- and Intra-Observer Agreement are Unsatisfactory when Determining Study Design and Level of Evidence |
title | Poster 121: Inter- and Intra-Observer Agreement are Unsatisfactory when Determining Study Design and Level of Evidence |
title_full | Poster 121: Inter- and Intra-Observer Agreement are Unsatisfactory when Determining Study Design and Level of Evidence |
title_fullStr | Poster 121: Inter- and Intra-Observer Agreement are Unsatisfactory when Determining Study Design and Level of Evidence |
title_full_unstemmed | Poster 121: Inter- and Intra-Observer Agreement are Unsatisfactory when Determining Study Design and Level of Evidence |
title_short | Poster 121: Inter- and Intra-Observer Agreement are Unsatisfactory when Determining Study Design and Level of Evidence |
title_sort | poster 121: inter- and intra-observer agreement are unsatisfactory when determining study design and level of evidence |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9344158/ http://dx.doi.org/10.1177/2325967121S00682 |
work_keys_str_mv | AT schmitzmatthew poster121interandintraobserveragreementareunsatisfactorywhendeterminingstudydesignandlevelofevidence AT mckayscott poster121interandintraobserveragreementareunsatisfactorywhendeterminingstudydesignandlevelofevidence AT patelneeraj poster121interandintraobserveragreementareunsatisfactorywhendeterminingstudydesignandlevelofevidence |
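The abstract above reports agreement using Fleiss’ kappa, the multi-rater generalization of Cohen’s kappa. As a point of reference only, the following is a minimal plain-NumPy sketch of that statistic; the `fleiss_kappa` function and the rating counts are hypothetical illustrations, not the authors’ code or data.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for a subjects-by-categories table of rating counts.

    counts[i, j] = number of raters who placed subject i in category j;
    every row must sum to the same number of raters.
    """
    counts = np.asarray(counts, dtype=float)
    n_subjects = counts.shape[0]
    n_raters = counts.sum(axis=1)[0]
    # Proportion of all ratings that fall in each category.
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)
    # Observed agreement among the raters, computed per subject.
    P_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar = P_i.mean()            # mean observed agreement
    P_e = np.square(p_j).sum()    # agreement expected by chance
    return (P_bar - P_e) / (1.0 - P_e)

# Hypothetical example: 5 abstracts rated by 13 reviewers into LOE categories I-V.
# These counts are illustrative only; the study's actual ratings are not reproduced here.
ratings = np.array([
    [0, 1, 9, 3, 0],
    [0, 0, 4, 9, 0],
    [1, 10, 2, 0, 0],
    [0, 0, 6, 7, 0],
    [0, 2, 8, 3, 0],
])
print(f"Fleiss' kappa = {fleiss_kappa(ratings):.2f}")
```

Kappa values of this kind are conventionally read against the Landis and Koch benchmarks, under which the reported round-1 LOE value (k=0.28) falls in the "fair" band (0.21 to 0.40) and the round-2 value (k=0.43) in the "moderate" band (0.41 to 0.60), consistent with the wording used in the abstract.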