Kappa coefficient: a popular measure of rater agreement
In mental health and psychosocial studies it is often necessary to report on the between-rater agreement of measures used in the study. This paper discusses the concept of agreement, highlighting its fundamental difference from correlation. Several examples demonstrate how to compute the kappa coeff...
Main Authors: TANG, Wan; HU, Jun; ZHANG, Hui; WU, Pan; HE, Hua
Format: Online Article (Text)
Language: English
Published: Shanghai Municipal Bureau of Publishing, 2015
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4372765/
https://www.ncbi.nlm.nih.gov/pubmed/25852260
http://dx.doi.org/10.11919/j.issn.1002-0829.215010
Similar Items
- Correlation and agreement: overview and clarification of competing concepts and measures
  by: LIU, Jinyuan, et al.
  Published: (2016)
- Relationships among three popular measures of differential risks: relative risk, risk difference, and odds ratio
  by: FENG, Changyong, et al.
  Published: (2016)
- Sample sizes based on three popular indices of risks
  by: Wang, Hongyue, et al.
  Published: (2018)
- Interrater reliability: the kappa statistic
  by: McHugh, Mary L.
  Published: (2012)
- Partial least squares regression and principal component analysis: similarity and differences between two popular variable reduction approaches
  by: Liu, Chenyu, et al.
  Published: (2022)