
Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer


Bibliographic Details
Main Authors: Gianinazzi, Micòl E., Rueegg, Corina S., Zimmerman, Karin, Kuehni, Claudia E., Michel, Gisela
Format: Online Article Text
Language: English
Published: Public Library of Science 2015
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4441480/
https://www.ncbi.nlm.nih.gov/pubmed/26001046
http://dx.doi.org/10.1371/journal.pone.0124290
_version_ 1782372800481198080
author Gianinazzi, Micòl E.
Rueegg, Corina S.
Zimmerman, Karin
Kuehni, Claudia E.
Michel, Gisela
author_facet Gianinazzi, Micòl E.
Rueegg, Corina S.
Zimmerman, Karin
Kuehni, Claudia E.
Michel, Gisela
author_sort Gianinazzi, Micòl E.
collection PubMed
description BACKGROUND: The abstraction of data from medical records is a widespread practice in epidemiological research. However, studies using this means of data collection rarely report reliability. Within the Transition after Childhood Cancer Study (TaCC), which is based on a medical record abstraction, we conducted a second independent abstraction of data with the aim of assessing a) the intra-rater reliability of one rater at two time points; b) possible learning effects between these two time points compared to a gold standard; and c) inter-rater reliability. METHOD: Within the TaCC study we conducted a systematic medical record abstraction in the 9 Swiss clinics with pediatric oncology wards. In a second phase we selected a subsample of medical records in 3 clinics for a second independent abstraction. We then assessed intra-rater reliability at two time points, the learning effect over time (comparing each rater at two time points with a gold standard), and the inter-rater reliability of a selected number of variables. We calculated percentage agreement and Cohen’s kappa. FINDINGS: For the assessment of intra-rater reliability we included 154 records (80 for rater 1; 74 for rater 2). For the inter-rater reliability we were able to include 70 records. Intra-rater reliability was substantial to excellent (Cohen’s kappa 0.6-0.8) with an observed percentage agreement of 75%-95%. Learning effects were observed for all variables. Inter-rater reliability was substantial to excellent (Cohen’s kappa 0.70-0.83) with high agreement ranging from 86% to 100%. CONCLUSIONS: Our study showed that data abstracted from medical records are reliable. Investigating intra-rater and inter-rater reliability can give confidence in drawing conclusions from the abstracted data and can increase data quality by minimizing systematic errors.
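
For orientation, the two agreement statistics named in the abstract can be computed as in the minimal Python sketch below. The function names and the yes/no/unknown example data are illustrative assumptions, not code or data from the TaCC study.

from collections import Counter

def percent_agreement(a, b):
    # Proportion of records on which the two abstractions assign the same code.
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    # Cohen's kappa: chance-corrected agreement between two ratings.
    # kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    # and p_e is the agreement expected by chance, derived from each
    # rater's marginal code frequencies.
    n = len(a)
    p_o = percent_agreement(a, b)
    freq_a, freq_b = Counter(a), Counter(b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: one categorical variable abstracted twice for 8 records.
rater1 = ["yes", "yes", "no", "unknown", "yes", "no", "no", "yes"]
rater2 = ["yes", "no", "no", "unknown", "yes", "no", "yes", "yes"]

print(f"percent agreement: {percent_agreement(rater1, rater2):.2%}")
print(f"Cohen's kappa:     {cohens_kappa(rater1, rater2):.2f}")

Kappa is the more conservative of the two because it discounts agreement that would occur by chance alone, which is why the study reports both.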
format Online
Article
Text
id pubmed-4441480
institution National Center for Biotechnology Information
language English
publishDate 2015
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-44414802015-05-28 Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer Gianinazzi, Micòl E. Rueegg, Corina S. Zimmerman, Karin Kuehni, Claudia E. Michel, Gisela PLoS One Research Article BACKGROUND: The abstraction of data from medical records is a widespread practice in epidemiological research. However, studies using this means of data collection rarely report reliability. Within the Transition after Childhood Cancer Study (TaCC), which is based on a medical record abstraction, we conducted a second independent abstraction of data with the aim of assessing a) the intra-rater reliability of one rater at two time points; b) possible learning effects between these two time points compared to a gold standard; and c) inter-rater reliability. METHOD: Within the TaCC study we conducted a systematic medical record abstraction in the 9 Swiss clinics with pediatric oncology wards. In a second phase we selected a subsample of medical records in 3 clinics for a second independent abstraction. We then assessed intra-rater reliability at two time points, the learning effect over time (comparing each rater at two time points with a gold standard), and the inter-rater reliability of a selected number of variables. We calculated percentage agreement and Cohen’s kappa. FINDINGS: For the assessment of intra-rater reliability we included 154 records (80 for rater 1; 74 for rater 2). For the inter-rater reliability we were able to include 70 records. Intra-rater reliability was substantial to excellent (Cohen’s kappa 0.6-0.8) with an observed percentage agreement of 75%-95%. Learning effects were observed for all variables. Inter-rater reliability was substantial to excellent (Cohen’s kappa 0.70-0.83) with high agreement ranging from 86% to 100%. CONCLUSIONS: Our study showed that data abstracted from medical records are reliable. Investigating intra-rater and inter-rater reliability can give confidence in drawing conclusions from the abstracted data and can increase data quality by minimizing systematic errors. Public Library of Science 2015-05-22 /pmc/articles/PMC4441480/ /pubmed/26001046 http://dx.doi.org/10.1371/journal.pone.0124290 Text en © 2015 Gianinazzi et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
spellingShingle Research Article
Gianinazzi, Micòl E.
Rueegg, Corina S.
Zimmerman, Karin
Kuehni, Claudia E.
Michel, Gisela
Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer
title Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer
title_full Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer
title_fullStr Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer
title_full_unstemmed Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer
title_short Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer
title_sort intra-rater and inter-rater reliability of a medical record abstraction study on transition of care after childhood cancer
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4441480/
https://www.ncbi.nlm.nih.gov/pubmed/26001046
http://dx.doi.org/10.1371/journal.pone.0124290
work_keys_str_mv AT gianinazzimicole intraraterandinterraterreliabilityofamedicalrecordabstractionstudyontransitionofcareafterchildhoodcancer
AT rueeggcorinas intraraterandinterraterreliabilityofamedicalrecordabstractionstudyontransitionofcareafterchildhoodcancer
AT zimmermankarin intraraterandinterraterreliabilityofamedicalrecordabstractionstudyontransitionofcareafterchildhoodcancer
AT kuehniclaudiae intraraterandinterraterreliabilityofamedicalrecordabstractionstudyontransitionofcareafterchildhoodcancer
AT michelgisela intraraterandinterraterreliabilityofamedicalrecordabstractionstudyontransitionofcareafterchildhoodcancer