Reviewer agreement trends from four years of electronic submissions of conference abstract

Bibliographic Details
Main Authors: Rowe, Brian H, Strome, Trevor L, Spooner, Carol, Blitz, Sandra, Grafstein, Eric, Worster, Andrew
Format: Text
Language: English
Published: BioMed Central 2006
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1473196/
https://www.ncbi.nlm.nih.gov/pubmed/16545143
http://dx.doi.org/10.1186/1471-2288-6-14
_version_ 1782127857404739584
author Rowe, Brian H
Strome, Trevor L
Spooner, Carol
Blitz, Sandra
Grafstein, Eric
Worster, Andrew
author_facet Rowe, Brian H
Strome, Trevor L
Spooner, Carol
Blitz, Sandra
Grafstein, Eric
Worster, Andrew
author_sort Rowe, Brian H
collection PubMed
description BACKGROUND: The purpose of this study was to determine the inter-rater agreement between reviewers on the quality of abstract submissions to an annual national scientific meeting (Canadian Association of Emergency Physicians; CAEP) and to identify factors associated with low agreement. METHODS: All abstracts were submitted using an on-line system and assessed by three volunteer CAEP reviewers blinded to the abstracts' source. Reviewers used an on-line form specific to each type of study design to score abstracts on nine criteria, each contributing from two to six points toward the total (maximum 24). The final score was the mean of the three reviewers' scores; inter-rater agreement was assessed using the intraclass correlation coefficient (ICC). RESULTS: 495 abstracts were received electronically during the four-year period 2001–2004, increasing from 94 abstracts in 2001 to 165 in 2004. The mean score for submitted abstracts over the four years was 14.4 (95% CI: 14.1–14.6). While there was no significant difference between mean total scores over the four years (p = 0.23), the ICC increased from fair (0.36; 95% CI: 0.24–0.49) to moderate (0.59; 95% CI: 0.50–0.68). Reviewers agreed less on individual criteria than on the total score in general, and less on subjective than on objective criteria. CONCLUSION: The correlation between reviewers' total scores suggests general recognition of "high quality" and "low quality" abstracts. Criteria based on the presence/absence of objective methodological parameters (e.g., blinding in a controlled clinical trial) resulted in higher inter-rater agreement than the more subjective, opinion-based criteria. In future abstract competitions, defining criteria more objectively so that reviewers can base their responses on empirical evidence may lead to increased consistency of scoring and, presumably, increased fairness to submitters.
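The METHODS field above describes a concrete scoring scheme: each abstract's final score is the mean of three reviewers' scores (out of 24), and agreement is summarized with an intraclass correlation coefficient. The record does not say which ICC form the authors used, so the following minimal sketch assumes the common two-way random-effects, single-rater form ICC(2,1); the score matrix is invented purely for illustration.

```python
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """Two-way random-effects, single-rater ICC(2,1).

    ratings: n_subjects x k_raters matrix of scores
    (here: abstracts x reviewers).
    """
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-abstract means
    col_means = ratings.mean(axis=0)   # per-reviewer means
    # Mean squares from the two-way ANOVA decomposition
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # between abstracts
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # between reviewers
    sse = ((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                        # residual error
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical scores: 5 abstracts rated by 3 reviewers on the 24-point scale
scores = np.array([
    [14, 15, 13],
    [20, 19, 21],
    [ 9, 11, 10],
    [16, 14, 15],
    [12, 13, 12],
])
final_scores = scores.mean(axis=1)  # final score = mean of the 3 reviewers
print(final_scores)
print(icc_2_1(scores))
```

With these made-up ratings the reviewers order the abstracts consistently, so the ICC comes out high; the fair-to-moderate values reported in the RESULTS field (0.36 to 0.59) correspond to considerably noisier real-world rating matrices.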
format Text
id pubmed-1473196
institution National Center for Biotechnology Information
language English
publishDate 2006
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-1473196 2006-06-03 Reviewer agreement trends from four years of electronic submissions of conference abstract Rowe, Brian H Strome, Trevor L Spooner, Carol Blitz, Sandra Grafstein, Eric Worster, Andrew BMC Med Res Methodol Research Article BACKGROUND: The purpose of this study was to determine the inter-rater agreement between reviewers on the quality of abstract submissions to an annual national scientific meeting (Canadian Association of Emergency Physicians; CAEP) and to identify factors associated with low agreement. METHODS: All abstracts were submitted using an on-line system and assessed by three volunteer CAEP reviewers blinded to the abstracts' source. Reviewers used an on-line form specific to each type of study design to score abstracts on nine criteria, each contributing from two to six points toward the total (maximum 24). The final score was the mean of the three reviewers' scores; inter-rater agreement was assessed using the intraclass correlation coefficient (ICC). RESULTS: 495 abstracts were received electronically during the four-year period 2001–2004, increasing from 94 abstracts in 2001 to 165 in 2004. The mean score for submitted abstracts over the four years was 14.4 (95% CI: 14.1–14.6). While there was no significant difference between mean total scores over the four years (p = 0.23), the ICC increased from fair (0.36; 95% CI: 0.24–0.49) to moderate (0.59; 95% CI: 0.50–0.68). Reviewers agreed less on individual criteria than on the total score in general, and less on subjective than on objective criteria. CONCLUSION: The correlation between reviewers' total scores suggests general recognition of "high quality" and "low quality" abstracts. Criteria based on the presence/absence of objective methodological parameters (e.g., blinding in a controlled clinical trial) resulted in higher inter-rater agreement than the more subjective, opinion-based criteria.
In future abstract competitions, defining criteria more objectively so that reviewers can base their responses on empirical evidence may lead to increased consistency of scoring and, presumably, increased fairness to submitters. BioMed Central 2006-03-19 /pmc/articles/PMC1473196/ /pubmed/16545143 http://dx.doi.org/10.1186/1471-2288-6-14 Text en Copyright © 2006 Rowe et al; licensee BioMed Central Ltd. http://creativecommons.org/licenses/by/2.0 This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Rowe, Brian H
Strome, Trevor L
Spooner, Carol
Blitz, Sandra
Grafstein, Eric
Worster, Andrew
Reviewer agreement trends from four years of electronic submissions of conference abstract
title Reviewer agreement trends from four years of electronic submissions of conference abstract
title_full Reviewer agreement trends from four years of electronic submissions of conference abstract
title_fullStr Reviewer agreement trends from four years of electronic submissions of conference abstract
title_full_unstemmed Reviewer agreement trends from four years of electronic submissions of conference abstract
title_short Reviewer agreement trends from four years of electronic submissions of conference abstract
title_sort reviewer agreement trends from four years of electronic submissions of conference abstract
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1473196/
https://www.ncbi.nlm.nih.gov/pubmed/16545143
http://dx.doi.org/10.1186/1471-2288-6-14
work_keys_str_mv AT rowebrianh revieweragreementtrendsfromfouryearsofelectronicsubmissionsofconferenceabstract
AT strometrevorl revieweragreementtrendsfromfouryearsofelectronicsubmissionsofconferenceabstract
AT spoonercarol revieweragreementtrendsfromfouryearsofelectronicsubmissionsofconferenceabstract
AT blitzsandra revieweragreementtrendsfromfouryearsofelectronicsubmissionsofconferenceabstract
AT grafsteineric revieweragreementtrendsfromfouryearsofelectronicsubmissionsofconferenceabstract
AT worsterandrew revieweragreementtrendsfromfouryearsofelectronicsubmissionsofconferenceabstract