Interrater reliability estimators tested against true interrater reliabilities
BACKGROUND: Interrater reliability, also known as intercoder reliability, is defined as true agreement between raters (coders) without chance agreement. It is used across many disciplines, including medical and health research, to measure the quality of ratings, coding, diagnoses, or other observations and...
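The abstract defines interrater reliability as agreement beyond chance. As an illustration only, below is a minimal sketch of one widely used chance-corrected estimator, Cohen's kappa (kappa = (p_o - p_e) / (1 - p_e)), which the paper is among those testing against true reliabilities; the function name and rating data are hypothetical, not taken from the article.

```python
# A minimal sketch of one chance-corrected agreement estimator,
# Cohen's kappa: kappa = (p_o - p_e) / (1 - p_e), where p_o is the
# observed agreement and p_e is the agreement expected by chance.
# Illustrative only; the data below is hypothetical.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters coding the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal proportions,
    # summed over all categories.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings of 10 items by two coders.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no", "yes", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
print(round(cohens_kappa(a, b), 3))  # 0.583
```

Here the coders agree on 8 of 10 items (p_o = 0.8), while matching marginals would produce agreement by chance alone on about half (p_e = 0.52), so kappa credits only the surplus over chance.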
Main Authors: Zhao, Xinshu; Feng, Guangchao Charles; Ao, Song Harris; Liu, Piper Liping
Format: Online Article, Text
Language: English
Published: BioMed Central, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9426226/
https://www.ncbi.nlm.nih.gov/pubmed/36038846
http://dx.doi.org/10.1186/s12874-022-01707-5
Similar Items

- Interrater reliability: the kappa statistic
  by: McHugh, Mary L.
  Published: (2012)
- Interrater Reliability of the Prone Apprehension Relocation Test
  by: Watchmaker, Lauren E., et al.
  Published: (2021)
- Interrater reliability in the assessment of physiotherapy students
  by: Gittinger, Flora P., et al.
  Published: (2022)
- Interrater reliability of clinical tests to evaluate scapulothoracic motion
  by: Baertschi, Evelyn, et al.
  Published: (2013)
- Interrater Reliability of Motion Palpation in the Thoracic Spine
  by: Walker, Bruce F., et al.
  Published: (2015)