
Examining intra-rater and inter-rater response agreement: A medical chart abstraction study of a community-based asthma care program



Bibliographic Details

Main Authors: To, Teresa, Estrabillo, Eileen, Wang, Chengning, Cicutto, Lisa

Format: Text

Language: English

Published: BioMed Central 2008

Subjects:

Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2396663/
https://www.ncbi.nlm.nih.gov/pubmed/18471298
http://dx.doi.org/10.1186/1471-2288-8-29
_version_ 1782155588081287168
author To, Teresa
Estrabillo, Eileen
Wang, Chengning
Cicutto, Lisa
author_facet To, Teresa
Estrabillo, Eileen
Wang, Chengning
Cicutto, Lisa
author_sort To, Teresa
collection PubMed
description BACKGROUND: To assess the intra- and inter-rater agreement of chart abstractors from multiple sites involved in the evaluation of an Asthma Care Program (ACP). METHODS: For intra-rater agreement, 110 charts randomly selected from 1,433 patients enrolled in the ACP across eight Ontario communities were re-abstracted by 10 abstractors. For inter-rater agreement, data abstractors reviewed a set of eight fictitious charts. Data abstraction involved information pertaining to six categories: physical assessment, asthma control, spirometry, asthma education, referral visits, and medication side effects. Percentage agreement and the kappa statistic (κ) were used to measure agreement. Sensitivity and specificity estimates were calculated by comparing results from all raters against the gold standard. RESULTS: Intra-rater re-abstraction yielded an overall kappa of 0.81. Kappa values for the chart abstraction categories were: physical assessment (κ 0.84), asthma control (κ 0.83), spirometry (κ 0.84), asthma education (κ 0.72), referral visits (κ 0.59) and medication side effects (κ 0.51). Inter-rater abstraction of the fictitious charts produced an overall kappa of 0.75, sensitivity of 0.91 and specificity of 0.89. Abstractors demonstrated agreement for physical assessment (κ 0.88, sensitivity and specificity 0.95), asthma control (κ 0.68, sensitivity 0.89, specificity 0.85), referral visits (κ 0.77, sensitivity 0.88, specificity 0.95), and asthma education (κ 0.49, sensitivity 0.87, specificity 0.77). CONCLUSION: Though data were collected by multiple abstractors, the results show high sensitivity and specificity and substantial to excellent inter- and intra-rater agreement, supporting confidence in the use of chart abstraction for evaluating the ACP.
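The abstract's agreement measures (percentage agreement, Cohen's kappa, and sensitivity/specificity against a gold standard) can be sketched for binary item-level abstraction results. The data below are fabricated for illustration only and do not reproduce the study's figures; the function names are hypothetical, not part of the study's analysis code.

```python
def cohen_kappa(a, b):
    """Cohen's kappa for two binary rating sequences of equal length."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    p_a1, p_b1 = sum(a) / n, sum(b) / n                    # marginal rates of "1"
    p_exp = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)          # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

def sens_spec(rater, gold):
    """Sensitivity and specificity of a rater against a gold standard."""
    tp = sum(r == 1 and g == 1 for r, g in zip(rater, gold))
    tn = sum(r == 0 and g == 0 for r, g in zip(rater, gold))
    fn = sum(r == 0 and g == 1 for r, g in zip(rater, gold))
    fp = sum(r == 1 and g == 0 for r, g in zip(rater, gold))
    return tp / (tp + fn), tn / (tn + fp)

# Made-up example: one abstractor's initial and repeat abstraction of ten
# binary chart items (intra-rater), plus a fictitious-chart gold standard.
first  = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
second = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
gold   = [1, 1, 0, 1, 0, 0, 1, 0, 0, 1]

agreement = sum(x == y for x, y in zip(first, second)) / len(first)
kappa = cohen_kappa(first, second)
sens, spec = sens_spec(first, gold)
```

With these toy ratings the pair agrees on 9 of 10 items (90% agreement), and kappa discounts that for the agreement expected by chance from each sequence's marginal rates, which is why kappa is the statistic reported in the abstract rather than raw percentage agreement.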
format Text
id pubmed-2396663
institution National Center for Biotechnology Information
language English
publishDate 2008
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-2396663 2008-05-28 Examining intra-rater and inter-rater response agreement: A medical chart abstraction study of a community-based asthma care program To, Teresa Estrabillo, Eileen Wang, Chengning Cicutto, Lisa BMC Med Res Methodol Research Article BioMed Central 2008-05-09 /pmc/articles/PMC2396663/ /pubmed/18471298 http://dx.doi.org/10.1186/1471-2288-8-29 Text en Copyright © 2008 To et al; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
To, Teresa
Estrabillo, Eileen
Wang, Chengning
Cicutto, Lisa
Examining intra-rater and inter-rater response agreement: A medical chart abstraction study of a community-based asthma care program
title Examining intra-rater and inter-rater response agreement: A medical chart abstraction study of a community-based asthma care program
title_full Examining intra-rater and inter-rater response agreement: A medical chart abstraction study of a community-based asthma care program
title_fullStr Examining intra-rater and inter-rater response agreement: A medical chart abstraction study of a community-based asthma care program
title_full_unstemmed Examining intra-rater and inter-rater response agreement: A medical chart abstraction study of a community-based asthma care program
title_short Examining intra-rater and inter-rater response agreement: A medical chart abstraction study of a community-based asthma care program
title_sort examining intra-rater and inter-rater response agreement: a medical chart abstraction study of a community-based asthma care program
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2396663/
https://www.ncbi.nlm.nih.gov/pubmed/18471298
http://dx.doi.org/10.1186/1471-2288-8-29
work_keys_str_mv AT toteresa examiningintraraterandinterraterresponseagreementamedicalchartabstractionstudyofacommunitybasedasthmacareprogram
AT estrabilloeileen examiningintraraterandinterraterresponseagreementamedicalchartabstractionstudyofacommunitybasedasthmacareprogram
AT wangchengning examiningintraraterandinterraterresponseagreementamedicalchartabstractionstudyofacommunitybasedasthmacareprogram
AT cicuttolisa examiningintraraterandinterraterresponseagreementamedicalchartabstractionstudyofacommunitybasedasthmacareprogram