
Kappa coefficient: a popular measure of rater agreement

Bibliographic Details
Main Authors: TANG, Wan, HU, Jun, ZHANG, Hui, WU, Pan, HE, Hua
Format: Online Article Text
Language: English
Published: Shanghai Municipal Bureau of Publishing 2015
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4372765/
https://www.ncbi.nlm.nih.gov/pubmed/25852260
http://dx.doi.org/10.11919/j.issn.1002-0829.215010
Description
Summary: In mental health and psychosocial studies it is often necessary to report on the between-rater agreement of measures used in the study. This paper discusses the concept of agreement, highlighting its fundamental difference from correlation. Several examples demonstrate how to compute the kappa coefficient – a popular statistic for measuring agreement – both by hand and by using statistical software packages such as SAS and SPSS. Real study data are used to illustrate how to use and interpret this coefficient in clinical research and practice. The article concludes with a discussion of the limitations of the coefficient.
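For readers who want a sense of the calculation the abstract refers to, Cohen's kappa for two raters is kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from the raters' marginal totals. The article itself works its examples by hand and in SAS and SPSS; the short Python sketch below is an illustrative assumption, not code from the paper.

# Minimal sketch of Cohen's kappa for two raters (illustrative only;
# the paper demonstrates the calculation by hand and in SAS/SPSS).
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Cohen's kappa from two equal-length lists of category labels."""
    n = len(rater1)
    # Observed agreement: proportion of subjects rated identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: product of marginal proportions, summed over categories.
    m1, m2 = Counter(rater1), Counter(rater2)
    p_e = sum((m1[c] / n) * (m2[c] / n) for c in set(rater1) | set(rater2))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two clinicians rating 10 patients as 'case' or 'non'.
r1 = ['case', 'case', 'non', 'non', 'case', 'non', 'case', 'non', 'non', 'case']
r2 = ['case', 'non', 'non', 'non', 'case', 'non', 'case', 'case', 'non', 'case']
print(round(cohen_kappa(r1, r2), 3))  # 0.6: observed agreement 0.8, chance agreement 0.5

The example illustrates the key point the paper makes about agreement versus correlation: kappa discounts the 80% raw agreement by the 50% agreement two raters with these marginals would reach by chance alone.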